2.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
CSQ326R201

Mission
At Databricks, we are on a mission to empower our customers to solve the world's toughest data problems by utilising the Data Intelligence Platform. As a Scale Solution Engineer, you will play a critical role in advising customers through their onboarding journey. You will work directly with customers to help them onboard and deploy Databricks in their production environments.

The Impact You Will Have
- You will ensure new customers have an excellent experience by providing them with technical assistance early in their journey.
- You will become an expert on the Databricks Platform and guide customers in making the best technical decisions to achieve their goals.
- You will work with multiple customers in parallel, tracking and reporting their progress.

What We Look For
- 2+ years of industry experience; an early-career technical professional, ideally in data-driven or cloud-based roles.
- Knowledge of at least one of the public cloud platforms (AWS, Azure, or GCP) is required.
- Knowledge of a programming language: Python, Scala, or SQL.
- Knowledge of the end-to-end data analytics workflow.
- Hands-on professional or academic experience in one or more of the following:
  - Data engineering technologies (e.g., ETL, dbt, Spark, Airflow)
  - Data warehousing technologies (e.g., SQL, stored procedures, Redshift, Snowflake)
- Excellent time management and presentation skills.
- Bonus: knowledge of data science and machine learning (e.g., building and deploying ML models).

About Databricks
Databricks is the data and AI company. More than 10,000 organizations worldwide — including Comcast, Condé Nast, Grammarly, and over 50% of the Fortune 500 — rely on the Databricks Data Intelligence Platform to unify and democratize data, analytics and AI. Databricks is headquartered in San Francisco, with offices around the globe, and was founded by the original creators of the Lakehouse, Apache Spark™, Delta Lake and MLflow. To learn more, follow Databricks on Twitter, LinkedIn and Facebook.

Benefits
At Databricks, we strive to provide comprehensive benefits and perks that meet the needs of all of our employees. For specific details on the benefits offered in your region, please visit https://www.mybenefitsnow.com/databricks.

Our Commitment to Diversity and Inclusion
At Databricks, we are committed to fostering a diverse and inclusive culture where everyone can excel. We take great care to ensure that our hiring practices are inclusive and meet equal employment opportunity standards. Individuals looking for employment at Databricks are considered without regard to age, color, disability, ethnicity, family or marital status, gender identity or expression, language, national origin, physical and mental ability, political affiliation, race, religion, sexual orientation, socio-economic status, veteran status, and other protected characteristics.

Compliance
If access to export-controlled technology or source code is required for performance of job duties, it is within Employer's discretion whether to apply for a U.S. government license for such positions, and Employer may decline to proceed with an applicant on this basis alone.
Posted 1 week ago
7.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
P-926

At Databricks, we are passionate about enabling data teams to solve the world's toughest problems — from making the next mode of transportation a reality to accelerating the development of medical breakthroughs. We do this by building and running the world's best data and AI infrastructure platform, so our customers can use deep data insights to improve their business. Founded by engineers — and customer obsessed — we leap at every opportunity to solve technical challenges, from designing next-gen UI/UX for interfacing with data to scaling our services and infrastructure across millions of virtual machines.

Databricks Mosaic AI offers a unique data-centric approach to building enterprise-quality machine learning and generative AI solutions, enabling organizations to securely and cost-effectively own and host ML and generative AI models, augmented or trained with their enterprise data. And we're only getting started in Bengaluru, India — we are currently setting up 10 new teams from scratch!

As a Senior Software Engineer at Databricks India, you can work across:
- Backend
- DDS (Distributed Data Systems)
- Full Stack

The Impact You'll Have
Our Backend teams span many domains across our essential service platforms. For instance, you might work on challenges such as:
- Problems that span from product to infrastructure, including distributed systems, at-scale service architecture and monitoring, workflow orchestration, and developer experience.
- Delivering reliable, high-performance services and client libraries for storing and accessing huge amounts of data on cloud storage backends, e.g., AWS S3 and Azure Blob Storage.
- Building reliable, scalable services (e.g., Scala, Kubernetes) and data pipelines (e.g., Apache Spark™, Databricks) to power the pricing infrastructure that serves millions of cluster-hours per day, and developing product features that empower customers to easily view and control platform usage.

Our DDS team spans:
- Apache Spark™
- Data Plane Storage
- Delta Lake
- Delta Pipelines
- Performance Engineering

As a Full Stack software engineer, you will work closely with your team and product management to deliver that delight through great user experience.

What We Look For
- BS (or higher) in Computer Science or a related field
- 7+ years of production-level experience in one of: Python, Java, Scala, C++, or a similar language
- Experience developing large-scale distributed systems from scratch
- Experience working on a SaaS platform or with service-oriented architectures

About Databricks
Databricks is the data and AI company. More than 10,000 organizations worldwide — including Comcast, Condé Nast, Grammarly, and over 50% of the Fortune 500 — rely on the Databricks Data Intelligence Platform to unify and democratize data, analytics and AI. Databricks is headquartered in San Francisco, with offices around the globe, and was founded by the original creators of the Lakehouse, Apache Spark™, Delta Lake and MLflow. To learn more, follow Databricks on Twitter, LinkedIn and Facebook.

Benefits
At Databricks, we strive to provide comprehensive benefits and perks that meet the needs of all of our employees. For specific details on the benefits offered in your region, please visit https://www.mybenefitsnow.com/databricks.

Our Commitment to Diversity and Inclusion
At Databricks, we are committed to fostering a diverse and inclusive culture where everyone can excel. We take great care to ensure that our hiring practices are inclusive and meet equal employment opportunity standards. Individuals looking for employment at Databricks are considered without regard to age, color, disability, ethnicity, family or marital status, gender identity or expression, language, national origin, physical and mental ability, political affiliation, race, religion, sexual orientation, socio-economic status, veteran status, and other protected characteristics.

Compliance
If access to export-controlled technology or source code is required for performance of job duties, it is within Employer's discretion whether to apply for a U.S. government license for such positions, and Employer may decline to proceed with an applicant on this basis alone.
Posted 1 week ago
10.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
GAQ225R106

We are seeking an experienced and detail-oriented Accounting Manager to lead our accounting operations in India. This role will oversee financial reporting, compliance, and process optimization while building a strong accounting team to support our international operations. The ideal candidate will have a deep understanding of Indian accounting regulations, US GAAP, and corporate compliance requirements. The role reports to the Director of International Accounting.

The Impact You Will Have
- Oversee all monthly and quarterly accounting processes for the Indian subsidiary, ensuring accuracy and timeliness
- Manage compliance requirements, including statutory filings and tax regulations, while maintaining strong vendor relationships
- Build and lead an accounting team in India to support international financial operations and ensure seamless coordination with global finance teams
- Ensure strict adherence to company group accounting policies and the correct application of US GAAP
- Support external audit requirements by providing accurate financial data and ensuring compliance in assigned areas of responsibility
- Conduct financial statement analysis, identifying key fluctuations and providing meaningful insights to assist management in decision-making
- Implement best practices for financial efficiency, process automation, and internal controls
- Collaborate with cross-functional teams to enhance financial reporting, analysis, and business performance insights
- Oversee and participate in quarterly and annual audits, working closely with external auditors to ensure compliance and accuracy
- Drive ad hoc financial projects as needed to support the company's growth and strategic initiatives

What We Look For
- Bachelor's or Master's degree in Accounting, Finance, or a related field
- Professional qualification (CA, ACCA, ICAEW, or similar)
- 10+ years of overall experience, including 8+ years of operational accounting experience
- 3+ years of experience managing a team
- Operational accounting experience in a growing SaaS technology business
- Excellent organizational and time management skills
- Strong knowledge of and experience with tools such as NetSuite, FloQast, and Coupa
- Team player with excellent communication skills and a desire for innovation
- Ability to build relationships across organizations
- Strong knowledge of Indian accounting standards, tax laws, and compliance regulations
- Experience with US GAAP
- Detail-oriented with a commitment to accuracy, analytics, and a stellar customer service approach

About Databricks
Databricks is the data and AI company. More than 10,000 organizations worldwide — including Comcast, Condé Nast, Grammarly, and over 50% of the Fortune 500 — rely on the Databricks Data Intelligence Platform to unify and democratize data, analytics and AI. Databricks is headquartered in San Francisco, with offices around the globe, and was founded by the original creators of the Lakehouse, Apache Spark™, Delta Lake and MLflow. To learn more, follow Databricks on Twitter, LinkedIn and Facebook.

Benefits
At Databricks, we strive to provide comprehensive benefits and perks that meet the needs of all of our employees. For specific details on the benefits offered in your region, please visit https://www.mybenefitsnow.com/databricks.

Our Commitment to Diversity and Inclusion
At Databricks, we are committed to fostering a diverse and inclusive culture where everyone can excel. We take great care to ensure that our hiring practices are inclusive and meet equal employment opportunity standards. Individuals looking for employment at Databricks are considered without regard to age, color, disability, ethnicity, family or marital status, gender identity or expression, language, national origin, physical and mental ability, political affiliation, race, religion, sexual orientation, socio-economic status, veteran status, and other protected characteristics.

Compliance
If access to export-controlled technology or source code is required for performance of job duties, it is within Employer's discretion whether to apply for a U.S. government license for such positions, and Employer may decline to proceed with an applicant on this basis alone.
Posted 1 week ago
7.0 - 10.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Title: Tech Lead - Modern Data
Job ID: POS-12081
Primary Skills: Databricks, ADF
Secondary Skills: Advanced SQL, Azure Databricks, PySpark, Azure Data Factory, and Azure Data Lake
Location: Hyderabad
Mode of Work: Work from Office
Experience: 7-10 years

About The Job
We are seeking a Tech Lead – Databricks Data Engineer with experience in designing and developing data pipelines using Azure Databricks, Data Factory, and Data Lake. The role involves managing large volumes of data, building complex ETL solutions, and working closely with business teams to deliver robust data transformations and analytics solutions.

Know Your Team
At ValueMomentum's Engineering Center, we are a team of passionate engineers who thrive on tackling complex business challenges with innovative solutions while transforming the P&C insurance value chain. We achieve this through a strong engineering foundation and by continuously refining our processes, methodologies, tools, agile delivery teams, and core engineering archetypes. Our core expertise lies in six key areas: Cloud Engineering, Application Engineering, Data Engineering, Core Engineering, Quality Engineering, and Domain expertise.

Join a team that invests in your growth. Our Infinity Program empowers you to build your career with role-specific skill development, leveraging immersive learning platforms. You'll have the opportunity to showcase your talents by contributing to impactful projects.

Responsibilities
- Design and develop ETL pipelines using ADF for data ingestion and transformation.
- Collaborate with Azure stack modules like Data Lakes and SQL DW to build robust data solutions.
- Write SQL, Python, and PySpark code for efficient data processing and transformation (a minimal PySpark sketch follows this listing).
- Understand and translate business requirements into technical designs.
- Develop mapping documents and transformation rules as per project scope.
- Communicate project status to stakeholders, ensuring smooth project execution.

Requirements
- 7-10 years of experience in data ingestion, data processing, and analytical pipelines for big data and relational databases.
- Hands-on experience with Azure services: ADLS, Azure Databricks, Data Factory, Synapse, Azure SQL DB.
- Experience in SQL, Python, and PySpark for data transformation and processing.
- Familiarity with DevOps and CI/CD deployments.
- Strong communication skills and attention to detail in high-pressure situations.
- Experience in the insurance or financial industry is preferred.

About The Company
ValueMomentum is a leading solutions provider for the global property and casualty insurance industry. The company helps insurers stay ahead with sustained growth and high performance, enhancing stakeholder value and fostering resilient societies. Having served over 100 insurers, ValueMomentum is one of the largest services providers exclusively focused on the insurance industry.

Benefits
We at ValueMomentum offer you a congenial environment to work and grow in the company of experienced professionals. Some of the benefits available to you are:
- Competitive compensation package.
- Career advancement: individual career development, coaching, and mentoring programs for professional and leadership skill development.
- Comprehensive training and certification programs.
- Performance management: goal setting, continuous feedback, and year-end appraisal.
- Reward and recognition for extraordinary performers.
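To make the ADF-plus-Databricks responsibilities above concrete, here is a minimal sketch of the kind of PySpark ingestion-and-transformation step an ADF-triggered Databricks job might run. The storage path, table name, and columns (policy_id, premium, effective_date) are illustrative assumptions, not part of the posting.

```python
# Minimal PySpark transformation of the kind an ADF-triggered Databricks job
# might run. All paths, column names, and table names are illustrative.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("policy-etl-sketch").getOrCreate()

# Ingest raw CSV landed in the data lake (hypothetical container/path).
raw = (spark.read
       .option("header", True)
       .option("inferSchema", True)
       .csv("abfss://raw@examplelake.dfs.core.windows.net/policies/"))

# Basic cleansing and typing: drop duplicates, standardize dates, filter bad rows.
clean = (raw.dropDuplicates(["policy_id"])
            .withColumn("effective_date", F.to_date("effective_date", "yyyy-MM-dd"))
            .filter(F.col("premium").isNotNull() & (F.col("premium") > 0)))

# Write the curated output as a Delta table partitioned by year for downstream SQL
# (assumes a Delta-enabled cluster, as on Databricks).
(clean.withColumn("effective_year", F.year("effective_date"))
      .write.format("delta")
      .mode("overwrite")
      .partitionBy("effective_year")
      .saveAsTable("curated.policies"))
```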
Posted 1 week ago
4.0 - 10.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Title: Tech Lead
Job ID: POS-13736
Primary Skills: Databricks, ADF
Secondary Skills: Advanced SQL, Azure Databricks, PySpark, Azure Data Factory, Azure Data Lake, and Azure Synapse
Location: Hyderabad
Mode of Work: Work from Office
Experience: 4-10 years

About The Job
We are seeking a Databricks Data Engineer/Lead with experience in designing and developing data pipelines using Azure Databricks, Data Factory, and Data Lake. The role involves managing large volumes of data, building complex ETL solutions, and working closely with business teams to deliver robust data transformations and analytics solutions.

Know Your Team
At ValueMomentum's Engineering Center, we are a team of passionate engineers who thrive on tackling complex business challenges with innovative solutions while transforming the P&C insurance value chain. We achieve this through a strong engineering foundation and by continuously refining our processes, methodologies, tools, agile delivery teams, and core engineering archetypes. Our core expertise lies in six key areas: Cloud Engineering, Application Engineering, Data Engineering, Core Engineering, Quality Engineering, and Domain expertise.

Join a team that invests in your growth. Our Infinity Program empowers you to build your career with role-specific skill development, leveraging immersive learning platforms. You'll have the opportunity to showcase your talents by contributing to impactful projects.

Responsibilities
- Design and develop ETL pipelines using ADF for data ingestion and transformation.
- Collaborate with Azure stack modules like Data Lakes and SQL DW to build robust data solutions.
- Write SQL, Python, and PySpark code for efficient data processing and transformation.
- Understand and translate business requirements into technical designs.
- Develop mapping documents and transformation rules as per project scope.
- Communicate project status to stakeholders, ensuring smooth project execution.

Requirements
- 4-10 years of experience in data ingestion, data processing, and analytical pipelines for big data and relational databases.
- Hands-on experience with Azure services: ADLS, Azure Databricks, Data Factory, Synapse, Azure SQL DB.
- Experience in SQL, Python, and PySpark for data transformation and processing.
- Familiarity with DevOps and CI/CD deployments.
- Strong communication skills and attention to detail in high-pressure situations.
- Experience in the insurance or financial industry is preferred.

About The Company
ValueMomentum is a leading solutions provider for the global property and casualty insurance industry. The company helps insurers stay ahead with sustained growth and high performance, enhancing stakeholder value and fostering resilient societies. Having served over 100 insurers, ValueMomentum is one of the largest services providers exclusively focused on the insurance industry.

Benefits
We at ValueMomentum offer you a congenial environment to work and grow in the company of experienced professionals. Some of the benefits available to you are:
- Competitive compensation package.
- Career advancement: individual career development, coaching, and mentoring programs for professional and leadership skill development.
- Comprehensive training and certification programs.
- Performance management: goal setting, continuous feedback, and year-end appraisal.
- Reward and recognition for extraordinary performers.
Posted 1 week ago
4.0 - 6.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Title: Senior Software Engineer - Modern Data
Job ID: POS-12094
Primary Skills: Databricks, ADF
Secondary Skills: IDMC/IICS and SQL
Location: Hyderabad
Mode of Work: Work from Office
Experience: 4-6 years

About The Job
We are looking for an IDMC developer to join our Data Leverage team – a team of high-energy individuals who thrive in a rapid-pace, agile product development environment. As a developer, you will provide accountability in the ETL and data integration space, from the development phase through delivery. You will work closely with the Project Manager, Technical Lead, and client teams. Your prime responsibility will be to develop bug-free code with proper unit testing and documentation. You will provide inputs to planning, estimation, scheduling, and coordination of technical activities related to ETL-based applications, and you will be responsible for meeting development schedules and delivering high-quality ETL-based solutions that meet technical specifications and design requirements, ensuring customer satisfaction. You are expected to possess good knowledge of one or more of these ETL tools: IDMC and Informatica.

Know Your Team
At ValueMomentum's Engineering Center, we are a team of passionate engineers who thrive on tackling complex business challenges with innovative solutions while transforming the P&C insurance value chain. We achieve this through a strong engineering foundation and by continuously refining our processes, methodologies, tools, agile delivery teams, and core engineering archetypes. Our core expertise lies in six key areas: Cloud Engineering, Application Engineering, Data Engineering, Core Engineering, Quality Engineering, and Domain expertise.

Join a team that invests in your growth. Our Infinity Program empowers you to build your career with role-specific skill development, leveraging immersive learning platforms. You'll have the opportunity to showcase your talents by contributing to impactful projects.

Responsibilities
- Design, develop, and optimize data pipelines and ETL processes using Informatica Intelligent Data Management Cloud (IDMC) tools to ingest, transform, and store data.
- Collaborate with cross-functional teams to integrate data from multiple sources into the IDMC platform and ensure smooth data flow.
- Build scalable data models, manage data warehousing solutions, and optimize data storage within the IDMC framework.
- Develop data transformation, integration, and quality checks using Informatica or other IDMC tools.
- Perform data cleansing and data quality checks to ensure clean, reliable data for business users.
- Troubleshoot and resolve issues related to data pipelines, ensuring reliability and stability.
- Collaborate with cross-functional teams to understand business requirements and deliver solutions.
- Ensure the quality and cleanliness of data using data validation and transformation techniques.

Requirements
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Strong experience in ETL tools: IDMC (CDI and CAI primary), Informatica (secondary).
- Good understanding of ETL and ELT best practices, and experience with ETL and ETL mapping documentation.
- Strong proficiency in SQL for working with data and writing complex queries.
- Experience in Agile or DevOps practices for data management projects.
- Strong problem-solving skills and attention to detail.
- Excellent communication and teamwork skills.
- Experience in the insurance (e.g., UW, claims, policy issuance) or financial industry preferred.

About The Company
ValueMomentum is a leading solutions provider for the global property and casualty insurance industry. The company helps insurers stay ahead with sustained growth and high performance, enhancing stakeholder value and fostering resilient societies. Having served over 100 insurers, ValueMomentum is one of the largest services providers exclusively focused on the insurance industry.

Benefits
We at ValueMomentum offer you a congenial environment to work and grow in the company of experienced professionals. Some of the benefits available to you are:
- Competitive compensation package.
- Career advancement: individual career development, coaching, and mentoring programs for professional and leadership skill development.
- Comprehensive training and certification programs.
- Performance management: goal setting, continuous feedback, and year-end appraisal.
- Reward and recognition for extraordinary performers.
Posted 1 week ago
5.0 - 11.0 years
0 Lacs
navi mumbai, maharashtra
On-site
As a Network & Svcs Operation Specialist at Accenture, you will play a crucial role in the Network Billing Operations - Problem Management domain. With 7 to 11 years of experience and any graduate degree, you will be responsible for handling various tasks related to billing for wireless telecommunication products and services. Your expertise will ensure the accuracy and quality of data associated with billing processes, requiring knowledge of finance for telecommunication mobility services.

Your primary responsibilities will include collecting, storing, and organizing data, as well as developing and implementing data analysis to identify anomalies and trends that could lead to potential billing issues (see the sketch after this listing). You will manage problems arising from IT infrastructure errors to minimize their impact on business operations, identifying root causes and taking corrective action promptly.

To excel in this role, you should have at least 5 years of advanced programming experience, particularly with SQL scripts, Python, and PySpark. Experience with Databricks and Palantir will be advantageous. Additionally, you must be self-motivated, have a desire to learn and understand data models and billing processes, exhibit critical thinking skills, and be proficient in reporting and metrics with strong numerical abilities.

Your role will also involve managing expenses, billing, financial aspects, and system processes efficiently. Good organizational skills, self-discipline, a systematic approach, and excellent interpersonal skills are essential for success in this position. You should have an analytical mindset, be a problem solver, and demonstrate flexibility in adapting to changing scenarios. Overall, your knowledge of telecom products and services will be a valuable asset in fulfilling your roles and responsibilities effectively.

Join Accenture, a global professional services company, and be part of our team dedicated to leveraging technology and human ingenuity to drive transformation and deliver value to clients worldwide. Visit www.accenture.com to learn more about our innovative services and solutions.
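To illustrate the kind of billing anomaly detection this role describes, here is a small PySpark sketch that flags charges more than three standard deviations from each subscriber's average. The table and column names (billing.charges, subscriber_id, bill_month, amount) are hypothetical assumptions, not Accenture's actual schema.

```python
# Sketch of a simple statistical anomaly check on billing data: flag charges
# more than three standard deviations from each subscriber's mean.
# Table and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.appName("billing-anomaly-sketch").getOrCreate()
charges = spark.table("billing.charges")  # subscriber_id, bill_month, amount

w = Window.partitionBy("subscriber_id")
flagged = (charges
           .withColumn("mean_amt", F.avg("amount").over(w))
           .withColumn("std_amt", F.stddev("amount").over(w))
           .withColumn("is_anomaly",
                       F.abs(F.col("amount") - F.col("mean_amt"))
                       > 3 * F.coalesce(F.col("std_amt"), F.lit(0.0))))

# Review the flagged charges before raising billing tickets.
flagged.filter("is_anomaly").select("subscriber_id", "bill_month", "amount").show()
```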
Posted 1 week ago
0.0 - 3.0 years
0 Lacs
kochi, kerala
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

The opportunity
As part of EYTS, your work will be to implement data integration and reporting solutions using ETL technology offerings.

Your key responsibilities
- Learn and adapt to the ETL technology landscape built on top of Microsoft Azure Data Services.
- Create complex SQL queries, including dynamic queries, to process data.
- Convert business and technical requirements into an appropriate technical solution and implement features using Azure Data Factory, Databricks, and Azure Data Lake Store.
- Responsibly own project tasks and take them through to completion.
- Maintain effective and consistent communication within the team, with peers, the leadership team, and peers in other IT groups.
- Produce high-quality deliverables following the project timeline.

To qualify for the role, you must have
- B.E/B.Tech/MCA/MS or an equivalent degree in a Computer Science discipline.
- Preferably 3-6 months of experience as a software developer.
- Deep knowledge of database concepts and the ability to write complex SQL queries.
- Knowledge of Microsoft Azure and its Data Lake-related services.
- Sound analytical and problem-solving skills needed to manage multiple technical challenges.
- Ability to work independently and with others.
- Strong organization and time-management skills.
- A go-getter attitude with very strong interpersonal skills.
- Strong verbal and written communication skills.
- Outstanding team player.
- Ability to manage and prioritize workload.
- Ability to work in a fast-paced environment.
- Quick learner with a can-do attitude.
- Flexibility and the ability to quickly and positively adapt to change.

Ideally, you'll also have
- Knowledge of any business application development platform, preferably Angular + Web API.
- Knowledge of the Power Platform: Power BI, Power Apps, and Power Automate.
- Knowledge of PMI and Agile standards.
- Industry-recognized certifications in Azure offerings would be a plus.

What we look for
As an entry-level ETL developer, we're looking for someone who has the knowledge and attitude to become productive through our Hire-Train-Deploy process. We expect the candidate to acquire the skills to convert product/feature designs into functioning components, with quality and on time, in line with architectural standards and principles, global product-specific guidelines, usability design standards, etc. They must also apply judgment in implementing application engineering methodologies, processes, and practices to the specific requirements of projects/programs, which may include product design engineering, information security, code maintainability, and reliability.

What working at EY offers
EY Global Delivery Services (GDS) is a dynamic and truly global delivery network. We work across six locations - Argentina, China, India, the Philippines, Poland & the UK - and with teams from all EY service lines, geographies & sectors, playing a vital role in the delivery of the EY growth strategy. From accountants to coders to advisory consultants, we offer a wide variety of fulfilling career opportunities that span all business disciplines.

In GDS, you will collaborate with EY teams on exciting projects and work with well-known brands from across the globe. We'll introduce you to an ever-expanding ecosystem of people, learning, skills & insights that will stay with you throughout your career.
- Continuous learning: You'll develop the mindset and skills to navigate whatever comes next.
- Success as defined by you: We'll provide the tools and flexibility so you can make a meaningful impact, your way.
- Transformative leadership: We'll give you the insights, coaching, and confidence to be the leader the world needs.
- Diverse and inclusive culture: You'll be embraced for who you are and empowered to use your voice to help others find theirs.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people, and society, and to build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform, and operate. Working across assurance, consulting, law, strategy, tax, and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
ahmedabad, gujarat
On-site
The Sr. Data Analytics Engineer at Ajmera Infotech plays a pivotal role in powering mission-critical decisions with governed insights for NYSE-listed clients. As part of a 120-engineer team specializing in highly regulated domains such as HIPAA, FDA, and SOC 2, you will be instrumental in delivering production-grade systems that transform data into a strategic advantage.

You will have the opportunity to make end-to-end impact by building full-stack analytics solutions, from lakehouse pipelines to real-time dashboards. The fail-safe engineering practices followed at Ajmera Infotech include TDD, CI/CD, DAX optimization, Unity Catalog, and cluster tuning. Working with a modern stack comprising Databricks, PySpark, Delta Lake, Power BI, and Airflow, you will be immersed in cutting-edge technologies. As a Sr. Data Analytics Engineer, you will foster a mentorship culture by leading code reviews, sharing best practices, and growing as a domain expert. Your role will involve helping enterprises migrate legacy analytics into cloud-native, governed platforms while maintaining a compliance-first mindset in HIPAA-aligned environments.

Key Responsibilities
- Build scalable pipelines using SQL, PySpark, and Delta Live Tables on Databricks (a minimal DLT sketch follows this listing).
- Orchestrate workflows with Databricks Workflows or Airflow, including implementing SLA-backed retries and alerting.
- Design dimensional models (star/snowflake) with Unity Catalog and Great Expectations validation.
- Deliver robust Power BI solutions, including dashboards, semantic layers, and paginated reports, with a focus on DAX optimization.
- Migrate legacy SSRS reports to Power BI seamlessly, without any loss of logic or governance.
- Optimize compute and cost efficiency through cache tuning, partitioning, and capacity monitoring.
- Document pipeline logic, RLS rules, and more in Git-controlled formats.
- Collaborate cross-functionally to convert product analytics needs into resilient BI assets.
- Champion mentorship by reviewing notebooks and dashboards and sharing platform standards.

Must-Have Skills
- 5+ years of experience in analytics engineering, with a minimum of 3 years in production Databricks/Spark contexts.
- Advanced SQL skills, including windowing functions; expert PySpark, Delta Lake, and Unity Catalog proficiency.
- Mastery of Power BI, covering DAX optimization, security rules, paginated reports, and more.
- Experience with SSRS-to-Power BI migration, with a focus on replicating RDL logic accurately.
- Strong familiarity with Git, CI/CD practices, and cloud platforms such as Azure or AWS.
- Strong communication skills to effectively bridge technical and business audiences.

Nice-to-Have Skills
- Databricks Data Engineer Associate certification.
- Experience with streaming pipelines (Kafka, Structured Streaming).
- Knowledge of data quality frameworks like dbt, Great Expectations, or similar tools.
- Exposure to BI platforms like Tableau, Looker, or similar tools.
- Understanding of cost governance aspects such as Power BI Premium capacity and Databricks chargeback mechanisms.

Ajmera Infotech offers competitive compensation, flexible hybrid schedules, and a deeply technical culture that empowers engineers to lead the narrative. If you are passionate about building reliable, audit-ready data products and want to take ownership of systems from raw data ingestion to KPI dashboards, apply now to engineer insights that truly matter.
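The responsibilities above mention Delta Live Tables; below is a minimal DLT sketch with the kind of data-quality expectations the listing implies. It runs only inside a Databricks DLT pipeline (where `dlt` and `spark` are provided by the runtime), and the source path and column names are assumptions for illustration.

```python
# Minimal Delta Live Tables sketch (runs only inside a Databricks DLT
# pipeline, not as a standalone script). Source path and column names
# are illustrative assumptions.
import dlt
from pyspark.sql import functions as F

@dlt.table(comment="Raw events ingested from cloud storage.")
def events_raw():
    # `spark` is injected by the DLT runtime; the path is hypothetical.
    return spark.read.format("json").load("/mnt/raw/events/")

@dlt.table(comment="Validated events ready for BI consumption.")
@dlt.expect_or_drop("valid_event_id", "event_id IS NOT NULL")
@dlt.expect_or_drop("valid_timestamp", "event_ts IS NOT NULL")
def events_clean():
    # Rows failing the expectations above are dropped and logged by DLT.
    return (dlt.read("events_raw")
              .withColumn("event_date", F.to_date("event_ts")))
```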
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
haryana
On-site
You will be responsible for developing applications using various Microsoft and web development technologies, such as ASP.NET, C#, MVC, Web Forms, Angular, SQL Server, T-SQL, and microservices. Your expertise in big data technologies like Hadoop, Spark, Hive, Python, and Databricks will be crucial for this role.

With a Bachelor's degree in Computer Science or equivalent experience through higher education, you should have at least 8 years of experience in data engineering and/or software engineering. Your strong coding skills, along with knowledge of infrastructure as code and automating production data and ML pipelines, will be highly valued. You should be proficient in on-prem-to-cloud migration, particularly to Azure, and have hands-on experience with Azure PaaS offerings such as Synapse, ADLS, Databricks, Event Hubs, Cosmos DB, Azure ML, etc. Experience in building, governing, and scaling data warehouses, lakes, and lakehouses is essential for this role.

Your expertise in developing and tuning stored procedures and T-SQL scripting in SQL Server, along with familiarity with various .NET development tools and products, will contribute significantly to the success of the projects. You should be adept with the agile software development lifecycle and DevOps principles to ensure efficient project delivery.
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
haryana
On-site
You will be joining Srijan, a Material company and renowned global digital engineering firm, as a Senior Developer/Lead specializing in Data Science. Your role will involve working on generative AI models such as Azure OpenAI GPT and multi-agent system architectures. You should be proficient in Python and AI/ML libraries like TensorFlow, PyTorch, and scikit-learn. Experience with frameworks like LangChain and AutoGen for multi-agent systems, and strong knowledge of data science techniques including data preprocessing, feature engineering, and model evaluation, are essential (a minimal sketch of that workflow follows this listing). At least 4+ years of experience in a similar role is preferred. The job location can be either Gurgaon or Bangalore.

Familiarity with big data tools such as Spark, Hadoop, and Databricks, and with SQL and NoSQL databases, will be beneficial. Additionally, expertise in ReactJS for building responsive and interactive user interfaces is a plus.

In this role, you can expect professional development and mentorship, a hybrid work mode with a remote-friendly workplace, health and family insurance, 40+ leaves per year including maternity and paternity leave, as well as access to wellness, meditation, and counseling sessions.
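As a concrete illustration of the preprocessing and model-evaluation workflow the listing names, here is a minimal scikit-learn sketch on synthetic data. The model choice, parameters, and dataset are illustrative assumptions, not part of the role.

```python
# Minimal scikit-learn sketch of the preprocess-train-evaluate loop,
# on synthetic data so it is self-contained.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import classification_report

# Synthetic stand-in for a real feature table.
X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)

model = Pipeline([
    ("scale", StandardScaler()),                  # preprocessing
    ("clf", LogisticRegression(max_iter=1000)),   # baseline classifier
])
model.fit(X_train, y_train)

# Model evaluation: precision/recall/F1 on the held-out split.
print(classification_report(y_test, model.predict(X_test)))
```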
Posted 1 week ago
6.0 - 10.0 years
0 Lacs
hyderabad, telangana
On-site
As a Data Engineer, you will be responsible for designing, developing, and maintaining robust ETL pipelines using Azure Data Factory (ADF) to support complex insurance data workflows. You will integrate and extract data from various Guidewire modules (PolicyCenter, BillingCenter, ClaimCenter), ensuring data quality, integrity, and consistency. Building reusable components for data ingestion, transformation, and orchestration across Guidewire and Azure ecosystems will be a key part of your role. Your responsibilities will also include optimizing ADF pipelines for performance, scalability, and cost-efficiency while following industry-standard DevOps and CI/CD practices.

Collaborating with solution architects, data modelers, and Guidewire functional teams to translate business requirements into scalable ETL solutions will be crucial. You will conduct thorough unit testing, data validation, and error handling across all data transformation steps, and participate in end-to-end data lifecycle management. Providing technical documentation and pipeline monitoring dashboards, and ensuring production readiness, will be part of your responsibilities. You will support data migration projects moving legacy platforms to Azure cloud environments, and follow Agile/Scrum practices, contributing to sprint planning, retrospectives, and stand-ups with strong ownership of deliverables.

**Mandatory Skills:**
- 6+ years of experience in data engineering with expertise in Azure Data Factory, Azure SQL, and related Azure services.
- Hands-on experience building ADF pipelines that integrate with the Guidewire Insurance Suite.
- Proficiency in data transformation using SQL, stored procedures, and Data Flows.
- Experience working with Guidewire data models and an understanding of PC/Billing/Claim schemas and business entities.
- Strong understanding of cloud-based data warehousing concepts, data lake patterns, and data governance best practices.
- Clear experience integrating Guidewire systems with downstream reporting and analytics platforms.
- Excellent debugging skills to resolve complex data transformation and pipeline performance issues.

**Preferred Skills:**
- Prior experience in the insurance (P&C preferred) domain or implementing Guidewire DataHub and/or InfoCenter.
- Familiarity with Power BI, Databricks, or Synapse Analytics.
- Working knowledge of Git-based source control, CI/CD pipelines, and deployment automation.

**Additional Requirements:**
- Work mode: 100% onsite at the Hyderabad office (no remote/hybrid flexibility).
- Strong interpersonal and communication skills to work effectively with cross-functional teams and client stakeholders.
- Self-starter mindset with a high sense of ownership, capable of thriving under pressure and tight deadlines.
Posted 1 week ago
1.0 years
0 Lacs
Kolkata, West Bengal, India
On-site
At PwC, our people in audit and assurance focus on providing independent and objective assessments of financial statements, internal controls, and other assurable information, enhancing the credibility and reliability of this information with a variety of stakeholders. They evaluate compliance with regulations, including assessing governance and risk management processes and related controls. Those in data, analytics and technology solutions at PwC will assist clients in developing solutions that help build trust, drive improvement, and detect, monitor, and predict risk. Your work will involve using advanced analytics, data wrangling technology, and automation tools to leverage data, with a focus on establishing the right processes and structures to enable our clients to make efficient and effective decisions based on accurate information that is complete and trustworthy.

Driven by curiosity, you are a reliable, contributing member of a team. In our fast-paced environment, you are expected to adapt to working with a variety of clients and team members, each presenting varying challenges and scope. Every experience is an opportunity to learn and grow. You are expected to take ownership and consistently deliver quality work that drives value for our clients and success as a team. As you navigate through the Firm, you build a brand for yourself, opening doors to more opportunities.

Skills
Examples of the skills, knowledge, and experiences you need to lead and deliver value at this level include but are not limited to:
- Apply a learning mindset and take ownership of your own development.
- Appreciate diverse perspectives, needs, and feelings of others.
- Adopt habits to sustain high performance and develop your potential.
- Actively listen, ask questions to check understanding, and clearly express ideas.
- Seek, reflect, act on, and give feedback.
- Gather information from a range of sources to analyse facts and discern patterns.
- Commit to understanding how the business works and building commercial awareness.
- Learn and apply professional and technical standards (e.g., refer to specific PwC tax and audit guidance), uphold the Firm's code of conduct, and meet independence requirements.

Preferred Knowledge/Skills
- Assist in collecting, cleaning, and processing data from various sources to support business objectives.
- Conduct exploratory data analysis to identify trends, patterns, and insights that drive strategic decision-making.
- Collaborate with team members to design and implement data models and visualizations using tools such as Excel, SQL, Python, or Power BI.
- Support the preparation of reports and presentations that communicate findings and insights to stakeholders in a clear and concise manner.
- Participate in the development and maintenance of documentation and data dictionaries to ensure data integrity and governance.
- Work with cross-functional teams to understand business requirements and deliver data-driven solutions.
- Stay updated with industry trends and best practices in data analytics, and contribute ideas for continuous improvement.

Good to Have
- Experience in a similar role in their current profile.
- Good accounting knowledge and experience in dealing with financial data are a plus.
- Knowledge of Azure Databricks, Alteryx, Python, SAS, or KNIME.
- Familiarity with data analysis tools and programming languages (e.g., Excel, SQL, Python, Databricks).
- Basic understanding of Power BI data visualization techniques and tools.
- Strong analytical and problem-solving skills with attention to detail.

Education
- Bachelor's degree in a related field such as Data Science, Statistics, Mathematics, Computer Science, or Economics, or equivalent experience.
- More than 1 year of experience in data analytics, data science, or a related role.
- Excellent verbal and written communication skills.
- Ability to work collaboratively in a team environment and manage multiple tasks efficiently.
- Eagerness to learn and adapt to new technologies and methodologies.
- CPA or equivalent certification.
Posted 1 week ago
2.0 - 5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
A career in our Cybersecurity, Privacy and Forensics practice will provide you the opportunity to solve our clients' most critical business and data-protection challenges. You will be part of a growing team driving strategic programs, data analytics, innovation, deals, cyber resiliency, response, and technical implementation activities. You will have access not only to the top Cybersecurity, Privacy and Forensics professionals at PwC, but also to those at our clients and industry analysts across the globe. Our Investigations team focuses on helping our clients detect and investigate fraudulent activities or irregularities within their organisation. As part of our team, you will perform fraud investigations, forensic accounting engagements, litigation support, crisis response, insurance claims support, and address anti-kickback and anti-bribery matters for a diverse group of both public and private multinational clients, not-for-profits, and state and local governments. You will work closely with the office of general counsel, chief compliance officers, boards, and outside counsel. Our team not only helps clients respond to instances of fraud or irregularity, but also works with them to emerge stronger as an entity.

To really stand out and make us fit for the future in a constantly changing world, each and every one of us at PwC needs to be an authentic and inclusive leader, at all grades/levels and in all lines of service. To help us achieve this we have the PwC Professional, our global leadership development framework. It gives us a single set of expectations across our lines, geographies and career paths, and provides transparency on the skills we need as individuals to be successful and progress in our careers, now and in the future.

Responsibilities
As an Associate, you'll work as part of a team of problem solvers, helping to solve complex business issues from strategy to execution. PwC Professional skills and responsibilities for this management level include but are not limited to:
- Use feedback and reflection to develop self-awareness and personal strengths, and address development areas.
- Delegate to others to provide stretch opportunities, and coach to help deliver results.
- Develop new ideas and propose innovative solutions to problems.
- Use a broad range of tools and techniques to extract insights from current trends in the business area.
- Review your work and that of others for quality, accuracy, and relevance.
- Share relevant thought leadership.
- Use straightforward communication, in a structured way, when influencing others.
- Read situations and modify behavior to build quality, diverse relationships.
- Uphold the firm's code of ethics and business conduct.

Overview
PwC's Insights & Forensics (I&F) Data Analytics team in the Acceleration Center (AC) is seeking a highly skilled and motivated Associate to join our dynamic team. The ideal candidate will bring technical expertise, innovative thinking, and strong analytical skills to deliver insights and value to our clients. This role demands a hands-on professional who can seamlessly work with cross-functional teams, process large volumes of data, and create impactful visualizations and analytics solutions.

Key Responsibilities
Data Analysis & Visualization:
- Develop and design interactive dashboards, reports, and data visualizations using Power BI and Tableau to provide actionable insights.
- Collaborate with stakeholders to gather business requirements and transform them into meaningful visuals and stories.

Data Engineering & Analytics Development:
- Utilize Python and PySpark to process, transform, and analyze large datasets efficiently.
- Implement advanced SQL queries and Alteryx for complex data manipulation, reporting, and validation (a short windowing-query sketch follows this listing).

Cloud Analytics Solutions:
- Build, maintain, and optimize analytics workflows using Azure Synapse Analytics and Databricks.
- Leverage cloud-based solutions to design scalable data pipelines and integrate disparate data sources.

Collaboration & Delivery:
- Partner with US stakeholders and internal teams to understand key business problems and deliver analytics-driven solutions.
- Document technical workflows, processes, and best practices to ensure high-quality deliverables and knowledge sharing.

Required Skills & Qualifications
Technical Proficiency:
- Expert-level knowledge and hands-on experience with Power BI for data visualization and reporting.
- Proficient in programming with Python and working with distributed computing frameworks like PySpark.
- Strong command of advanced SQL and Alteryx for data analysis and manipulation across relational databases.
- Demonstrated expertise in working with Azure Synapse Analytics and Databricks for building cloud-based data solutions.

Problem-Solving & Analytical Thinking:
- Strong ability to analyze complex business problems and translate them into data-driven solutions.
- Creative mindset with a focus on delivering value through innovative approaches to analytics and visualization.

Communication & Collaboration:
- Excellent verbal and written communication skills for engaging with stakeholders and explaining technical concepts.
- Ability to work independently and collaboratively within a team, ensuring high standards of delivery.

Education & Experience:
- Bachelor's or Master's degree in Data Science, Data Analytics, Information Systems, or a related field.
- 2-5 years of relevant experience in data analytics, data engineering, or business intelligence roles.
- Experience in professional services or consulting is a plus.

Good-to-Have Skills
- Experience in prompt engineering and working with generative AI (GenAI) solutions.
- Knowledge of data science concepts and methodologies.
- Familiarity with machine learning techniques and frameworks.

Education Qualification
Bachelor of Commerce degree in Accounting or Finance, preferably with a Certified Fraud Examiner (CFE) or Certified Public Accountant (CPA) certification. A Master's in Accounting or equivalent is good to have.
Posted 1 week ago
0.6 - 2.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
At PwC, our people in managed services focus on a variety of outsourced solutions and support clients across numerous functions. These individuals help organisations streamline their operations, reduce costs, and improve efficiency by managing key processes and functions on their behalf. They are skilled in project management, technology, and process optimization to deliver high-quality services to clients. Those in managed service management and strategy at PwC will focus on transitioning and running services, along with managing delivery teams, programmes, commercials, performance, and delivery risk. Your work will involve continuous improvement and optimization of the managed services processes, tools, and services.

You are a reliable, contributing member of a team. In our fast-paced environment, you are expected to adapt, take ownership, and consistently deliver quality work that drives value for our clients and success as a team.

Skills
Examples of the skills, knowledge, and experiences you need to lead and deliver value at this level include but are not limited to:
- Apply a learning mindset and take ownership of your own development.
- Appreciate diverse perspectives, needs, and feelings of others.
- Adopt habits to sustain high performance and develop your potential.
- Actively listen, ask questions to check understanding, and clearly express ideas.
- Seek, reflect, act on, and give feedback.
- Gather information from a range of sources to analyse facts and discern patterns.
- Commit to understanding how the business works and building commercial awareness.
- Learn and apply professional and technical standards (e.g., refer to specific PwC tax and audit guidance), uphold the Firm's code of conduct, and meet independence requirements.

Role: Specialist
Tower: Data, Analytics & Specialist Managed Service
Experience: 0.6-2 years
Key Skills: Azure
Educational Qualification: BE / B Tech / ME / M Tech / MBA
Work Location: India

Job Description
As a Specialist, you will work as part of a team of problem solvers, helping to solve complex business issues from strategy to execution. PwC Professional skills and responsibilities for this management level include but are not limited to:
- Use feedback and reflection to develop self-awareness and personal strengths, and address development areas.
- Be flexible to work in stretch opportunities/assignments.
- Demonstrate critical thinking and the ability to bring order to unstructured problems.
- Review ticket quality and deliverables; provide status reporting for the project.
- Adhere to SLAs; experience in incident management, change management, and problem management.
- Seek and embrace opportunities that give exposure to different situations, environments, and perspectives.
- Use straightforward communication, in a structured way, when influencing and connecting with others.
- Read situations and modify behavior to build quality relationships.
- Uphold the firm's code of ethics and business conduct.
- Demonstrate leadership capabilities by working with clients directly and leading engagements.
- Work in a team environment that includes client interactions, workstream management, and cross-team collaboration.
- Be a good team player, take up cross-competency work, and contribute to COE activities.
- Handle escalation and risk management.

Position Requirements
Required Skills: Azure Cloud Engineer

The candidate is expected to demonstrate extensive knowledge and/or a proven record of success in the following areas:
- Minimum 6 years of hands-on experience building advanced data warehousing solutions on leading cloud platforms.
- Minimum 3-5 years of Operate/Managed Services/Production Support experience.
- Extensive experience developing scalable, repeatable, and secure data structures and pipelines to ingest, store, collect, standardize, and integrate data for downstream consumption, such as business intelligence systems, analytics modeling, and data scientists.
- Designing and implementing data pipelines to extract, transform, and load (ETL) data from various sources into data storage systems, such as data warehouses or data lakes.
- Experience building efficient ETL/ELT processes using industry-leading tools like Informatica, Talend, SSIS, AWS, Azure, Spark, SQL, Python, etc.
- Hands-on experience with data analytics tools like Informatica, Collibra, Hadoop, Spark, and Snowflake.
- Design, implement, and maintain data pipelines for data ingestion, processing, and transformation in Azure.
- Work with data scientists and analysts to understand data needs and create effective data workflows.
- Create and maintain data storage solutions, including Azure SQL Database, Azure Data Lake, and Azure Blob Storage.
- Create and maintain ETL (extract, transform, load) operations using Azure Data Factory or comparable technologies.
- Perform data transformation and processing tasks to prepare data for analysis and reporting, using Azure Databricks or Azure Synapse Analytics for large-scale transformations with tools like Apache Spark.
- Implement data validation and cleansing procedures to ensure the quality, integrity, and dependability of the data.
- Improve the scalability, efficiency, and cost-effectiveness of data pipelines.
- Monitor and troubleshoot data pipelines, resolving issues related to data processing, transformation, or storage.
- Implement and maintain data security and privacy measures, including access controls and encryption, to protect sensitive data.
- Advanced working SQL knowledge and experience working with relational databases and query authoring (SQL), as well as working familiarity with a variety of databases.
- Experience building and maintaining data governance solutions (data quality, metadata management, lineage, master data management, and data security) using industry-leading tools.
- Scaling and optimizing schemas, and performance-tuning SQL and ETL pipelines in data lake and data warehouse environments.
- Experience with ITIL processes such as incident management, problem management, knowledge management, release management, and data DevOps.
- Strong communication, problem-solving, quantitative, and analytical abilities.

Nice to have: Azure certification.

Managed Services - Data, Analytics & Insights Managed Service
At PwC we relentlessly focus on working with our clients to bring the power of technology and humans together and create simple, yet powerful solutions. We imagine a day when our clients can simply focus on their business, knowing that they have a trusted partner for their IT needs. Every day we are motivated and passionate about making our clients better.

Within our Managed Services platform, PwC delivers integrated services and solutions that are grounded in deep industry experience and powered by the talent that you would expect from the PwC brand. The PwC Managed Services platform delivers scalable solutions that add greater value to our clients' enterprise through technology and human-enabled experiences. Our team of highly skilled and trained global professionals, combined with the use of the latest advancements in technology and process, allows us to provide effective and efficient outcomes. With PwC's Managed Services, our clients are able to focus on accelerating their priorities, including optimizing operations and accelerating outcomes. PwC brings a consultative-first approach to operations, leveraging our deep industry insights combined with world-class talent and assets to enable transformational journeys that drive sustained client outcomes. Our clients need flexible access to world-class business and technology capabilities that keep pace with today's dynamic business environment.

Within our global Managed Services platform, we provide Data, Analytics & Insights, where we focus on the evolution of our clients' data and analytics ecosystem. Our focus is to empower our clients to navigate and capture the value of their data and analytics portfolio while cost-effectively operating and protecting their solutions. We do this so that our clients can focus on what matters most to their business: accelerating growth that is dynamic, efficient, and cost-effective. As a member of our Data, Analytics & Insights Managed Service team, we are looking for candidates who thrive working in a high-paced environment and are capable of working on a mix of critical Data, Analytics & Insights offerings and engagements, including help desk support, enhancement and optimization work, as well as strategic roadmap and advisory-level work. It will also be key to lend experience and effort to helping win and support customer engagements, not only from a technical perspective but also from a relationship perspective.
Posted 1 week ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
About The Role
We are seeking a highly skilled Lead Databricks Engineer to join our team at DXC Technology, where you will play a pivotal role in building and scaling enterprise-grade data solutions on the Azure ecosystem. You'll lead the development of robust data pipelines, analytics frameworks, and cloud-native solutions using cutting-edge tools like Azure Databricks (ADB), Azure Data Factory (ADF), Delta Lake, and PySpark. This role is ideal for professionals who thrive in fast-paced, collaborative environments and are passionate about data engineering, cloud transformation, and data-driven business outcomes.
Responsibilities
Design, develop, and maintain scalable data ingestion and transformation pipelines using Azure Databricks, PySpark, and ADF.
Write, optimize, and manage complex SQL queries and transformations across structured and unstructured data sources.
Implement data lake solutions using Delta Lake, Unity Catalog, and Azure Synapse Analytics.
Apply data profiling, cataloging, and lineage techniques to ensure data quality and governance compliance.
Enable secure and governed access to data assets using Azure's data governance features and role-based access control (RBAC).
Serve as technical lead on large-scale data engineering initiatives, guiding junior developers and ensuring architectural best practices.
Lead Proof of Concepts (PoCs), prototype development, and pilot solution implementations to validate technical feasibility.
Work closely with solution architects, business analysts, DevOps teams, and client stakeholders to translate requirements into data-driven outcomes.
Take part in business process mapping and contribute to data architecture decisions aligned with business needs.
Utilize Azure DevOps for CI/CD, automation of data pipelines, and environment management.
Establish and enforce coding standards, data modeling conventions, and deployment workflows.
Monitor and optimize the performance of data pipelines, ETL/ELT processes, and data storage systems.
Troubleshoot and resolve performance bottlenecks and errors in production.
Qualifications
5+ years of hands-on experience as a Data Engineer in Azure cloud environments, including:
Azure Databricks (ADB)
Azure Data Factory (ADF)
Delta Lake / Delta Tables
Synapse Analytics
Unity Catalog
Strong programming proficiency in Python, PySpark, and Spark SQL.
Deep understanding of SQL with the ability to write complex queries, joins, window functions, etc.
Experience with Azure DevOps for pipeline automation, code repositories (Git), and CI/CD integration.
Demonstrated ability to conduct data profiling, data cataloging, and data lineage tracking.
Strong knowledge of data governance, security, and compliance within cloud data environments.
Excellent analytical thinking, problem-solving, and communication skills. (ref:hirist.tech)
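As a rough sketch of the Delta Lake work this role involves, the snippet below shows a PySpark upsert (MERGE) into a Delta table on Databricks. The catalog, table, and key names are hypothetical.

```python
# Minimal Delta Lake upsert sketch on Databricks (hypothetical table and key names).
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Staged changes to apply to the target table.
updates = spark.read.format("delta").load("/mnt/staging/customer_updates")

# Unity Catalog three-part name: catalog.schema.table.
target = DeltaTable.forName(spark, "main.sales.customers")

# Merge new/changed rows into the target on the business key.
(
    target.alias("t")
    .merge(updates.alias("s"), "t.customer_id = s.customer_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)
```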
Posted 1 week ago
6.0 - 10.0 years
0 - 0 Lacs
Maharashtra
On-site
Job Description
About the Job
WonDRx (pronounced "Wonder-Rx") is an innovative and disruptive technology platform in healthcare, aiming to connect patients, doctors, and the entire healthcare ecosystem on a single platform. We are looking for a Data Analytics and Research Manager (AI-driven) to lead our analytics and insights strategy in line with our fast-growing product and business goals. This person will manage data pipelines, apply AI/ML models, perform healthcare research, and build a small but high-performing analytics team.
Key Responsibilities
Define and lead the data and analytics roadmap.
Design and manage health data pipelines, dashboards, and KPIs.
Apply ML/NLP for patient behavior prediction and analytics automation.
Conduct market and competitor research to support business strategies.
Collaborate across teams and present insights to CXOs.
Mentor a data analytics team, ensuring accuracy and impact.
Tools & Technologies
Languages: SQL, Python/R
AI/ML: scikit-learn, TensorFlow
BI Tools: Power BI, Tableau, Looker
Cloud Stack: BigQuery, Snowflake, AWS, Databricks
GenAI Tools: ChatGPT, Copilot, Custom LLMs
Qualifications
Bachelor's/Master's in Data Science, Statistics, Engineering, or a related field.
6-10 years in analytics, with 2+ years in a leadership role.
Strong business acumen, preferably in healthcare/life sciences.
Hands-on AI/ML experience.
Excellent communication and storytelling skills.
Join us to transform the healthcare experience for millions.
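To ground the "ML for patient behavior prediction" responsibility, a minimal scikit-learn sketch follows. The features, labels, and task (appointment no-show risk) are synthetic illustrations, not WonDRx specifics.

```python
# Illustrative patient-behavior classifier (e.g., appointment no-show risk).
# Features and labels are synthetic; real work would start from governed health data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 1_000
X = np.column_stack([
    rng.integers(18, 90, n),   # age
    rng.integers(0, 10, n),    # prior visits
    rng.integers(0, 2, n),     # reminder sent (0/1)
])
y = rng.integers(0, 2, n)      # no-show label (synthetic)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
print("AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
```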
Posted 1 week ago
12.0 - 16.0 years
0 Lacs
Hyderabad, Telangana
On-site
NTT DATA is looking for a Data Engineering Senior Director to join the team in Hyderabad, Telangana (IN-TG), India (IN). As a Data & AI Architect, you will be responsible for delivering multi-technology consulting services to clients by providing strategies and solutions for infrastructure and related technology components. Your role will involve collaborating with stakeholders to develop architectural approaches for solutions and working on strategic projects to ensure the optimal functioning of clients' technology infrastructure.
Key Responsibilities:
Engage in conversations with CEOs, business owners, and CTOs/CDOs
Analyze complex business challenges, develop effective solutions, and prioritize client needs
Design innovative solutions for complex business problems
Utilize best practices and creativity to address challenges
Conduct market research, formulate perspectives, and communicate insights to clients
Build strong client relationships and ensure client satisfaction
Focus on details while keeping a strategic business perspective
Demonstrate excellent client service orientation
Work effectively in high-pressure situations
Establish and manage processes through collaboration and business understanding
Generate new business opportunities and contribute to internal effectiveness through process improvement
Provide insights on relevant vertical markets and enhance current methodologies, processes, and tools
Minimum Skills Required:
Academic qualifications: BE/BTech or equivalent in Information Technology and/or Business Management
Scaled Agile certification desirable
Relevant consulting and technical certifications preferred, e.g., TOGAF
Required Experience:
12-15 years of experience in a similar role in a large-scale technology services environment
Proficiency in Data, AI, Gen AI, and Agentic AI
Experience in data architecture and solutioning, including end-to-end data architecture and GenAI solution design
Ability to work on Data & AI RFP responses as a Solution Architect
Strong experience in solution architecting of Data & Analytics, AI/ML, and Gen AI as a Technical Architect
Proficiency with Snowflake, Databricks, Azure, AWS, GCP cloud, and Data Engineering & AI tools
Experience in consulting and program execution engagements in AI and data
Expertise in multi-technology infrastructure design, client needs assessment, and change management
Additional Career Level Description:
Knowledge and application: Seasoned, experienced professional with complete knowledge in the area of specialization
Problem-solving: Works on diverse problems requiring evaluation and creative solutions
Interaction: Enhances relationships with internal/external partners, often requiring persuasion
NTT DATA is a trusted global innovator of business and technology services, serving 75% of the Fortune Global 100. We are committed to helping clients innovate, optimize, and transform for long-term success. With diverse experts in over 50 countries and a robust partner ecosystem, we offer business and technology consulting, data and artificial intelligence solutions, industry-specific services, and digital infrastructure. Join us to be part of a leading provider of digital and AI infrastructure, shaping the future confidently and sustainably. Visit us at us.nttdata.com.
Posted 1 week ago
6.0 - 10.0 years
0 - 0 Lacs
Pune, Maharashtra
On-site
We are looking for a highly skilled Technical Data Analyst to join our team and help establish a single source of truth for our high-volume, direct-to-consumer accounting and financial data warehouse. The ideal candidate has a solid background in data analysis, SQL, and data transformation, along with prior experience in financial data warehousing and reporting. You will collaborate closely with finance and accounting teams to gather requirements, develop dashboards, and transform data to facilitate month-end accounting, tax reporting, and financial forecasting. The current financial data warehouse is built in Snowflake and is transitioning to Databricks; you will be in charge of migrating reporting and transformation processes to Databricks while ensuring the accuracy and consistency of the data.
Key Responsibilities:
1. Data Analysis & Reporting:
Construct and maintain month-end accounting and tax dashboards using SQL and Snowsight in Snowflake.
Shift reporting processes to Databricks, creating dashboards and reports to aid finance and accounting teams.
Engage with finance and accounting stakeholders to elicit requirements and provide actionable insights.
2. Data Transformation & Aggregation:
Develop and execute data transformation pipelines in Databricks to aggregate financial data and generate balance-sheet look-forward views.
Guarantee data accuracy and consistency during the migration from Snowflake to Databricks.
Collaborate with the data engineering team to enhance data ingestion and transformation processes.
3. Data Integration & ERP Collaboration:
Aid in integrating financial data from the data warehouse into NetSuite ERP by ensuring proper data transformation and validation.
Work with cross-functional teams to maintain seamless data flow between systems.
4. Data Ingestion & Tools:
Be familiar with Fivetran for data ingestion (expertise not mandatory, but familiarity is essential).
Address and resolve data-related issues in collaboration with the data engineering team.
Additional Qualifications:
Minimum of 3 years of experience as a Data Analyst or in a similar role, preferably in a financial or accounting setting.
Proficiency in SQL and hands-on experience with Snowflake and Databricks.
Demonstrated ability to construct dashboards and reports for financial data, such as month-end close, tax reporting, and balance sheets.
Knowledge of Fivetran or similar data ingestion tools.
Understanding of financial data concepts like general ledger, journals, balance sheets, and income statements.
Experience with data transformation and aggregation in a cloud-based environment.
Effective communication skills for collaboration with finance and accounting teams.
Desirable: Familiarity with NetSuite ERP or comparable financial systems.
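For flavor on the Snowflake-to-Databricks migration this role covers, below is a hedged sketch using the Spark Snowflake connector to pull a source table into Databricks for a row-count validation. The connection values and table names are placeholders, and the option keys should be verified against the connector documentation.

```python
# Sketch: read a Snowflake table into Databricks to validate a migrated Delta copy.
# Credentials and names are placeholders; options follow the Spark Snowflake connector.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

sf_options = {
    "sfURL": "account.snowflakecomputing.com",
    "sfUser": "svc_analytics",
    "sfPassword": "<from-secret-scope>",  # never hard-code; use a secret scope
    "sfDatabase": "FINANCE",
    "sfSchema": "GL",
    "sfWarehouse": "REPORTING_WH",
}

snowflake_gl = (
    spark.read.format("snowflake")
    .options(**sf_options)
    .option("dbtable", "GENERAL_LEDGER")
    .load()
)

databricks_gl = spark.read.table("finance.gl.general_ledger")  # migrated Delta table

# Quick consistency check between the source and the migrated copy.
print("Snowflake rows:", snowflake_gl.count(), "| Databricks rows:", databricks_gl.count())
```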
Posted 1 week ago
10.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Title: Data Engineering Subject Matter Expert (SME)
Location: Dubai, UAE (Hybrid/Onsite)
Experience: 10+ years in Data Engineering and ETL, with proven leadership and solution delivery experience
Job Summary
We are seeking a seasoned Data Engineering SME with strong experience in data platforms, ETL tools, and cloud technologies. The ideal candidate will lead the design and implementation of enterprise-scale data solutions, provide strategic guidance on data architecture, and play a key role in data migration, data quality, and performance tuning initiatives. This role demands a mix of deep technical expertise, project management, and stakeholder communication.
Key Responsibilities
Lead the design, development, and deployment of robust, scalable ETL pipelines and data solutions.
Provide technical leadership and SME support for data engineering teams across multiple projects.
Collaborate with cross-functional teams including data analysts, BI developers, product owners, and IT to gather requirements and deliver data products.
Design and optimize data workflows using tools such as IBM DataStage, Talend, Informatica, and Databricks.
Implement data integration solutions for structured and unstructured data across on-premise and cloud platforms.
Conduct performance tuning and optimization of ETL jobs and SQL queries.
Oversee data quality checks, data governance compliance, and PII data protection strategies.
Support and mentor team members on data engineering best practices and agile methodologies.
Analyze and resolve production issues in a timely manner.
Contribute to enterprise-wide data transformation strategies, including legacy-to-digital migration using Spark, Hadoop, and cloud platforms.
Manage stakeholder communications and provide regular status reports.
Required Skills And Qualifications
Bachelor's degree in Engineering, Computer Science, or a related field (MTech in Data Science is a plus).
10+ years of hands-on experience in ETL development and data engineering.
Strong proficiency with tools: IBM DataStage, Talend, Informatica, Databricks, Power BI, Tableau.
Strong SQL, PL/I, Python, and Unix shell scripting skills.
Experience with cloud platforms like AWS and modern big data tools like Hadoop and Spark.
Solid understanding of data warehousing, data modeling, and data migration practices.
Experience working in Agile/Scrum environments.
Excellent problem-solving, communication, and team collaboration skills.
Scrum Master or Product Owner certifications (CSM, CSPO) are a plus. (ref:hirist.tech)
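As a small illustration of the ETL performance-tuning work mentioned above, here is a PySpark sketch showing two common levers: broadcasting a small dimension table and controlling output partitioning. The table names are hypothetical.

```python
# Two common Spark tuning levers, illustrated on hypothetical tables:
# broadcasting a small dimension table and controlling output partitioning.
from pyspark.sql import SparkSession
from pyspark.sql.functions import broadcast

spark = SparkSession.builder.getOrCreate()

facts = spark.read.table("warehouse.sales_transactions")  # large fact table
dims = spark.read.table("warehouse.store_dim")            # small dimension table

# Broadcast the small side to avoid a shuffle-heavy sort-merge join.
joined = facts.join(broadcast(dims), "store_id")

# Repartition by the write key so output files align with downstream query patterns.
(
    joined.repartition("region")
    .write.mode("overwrite")
    .partitionBy("region")
    .saveAsTable("warehouse.sales_by_region")
)
```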
Posted 1 week ago
3.0 - 4.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Bridgenext is Hiring: Azure Data Engineer
Job Summary:
We are seeking a skilled and experienced Azure Data Engineer to join our dynamic team. The ideal candidate will be responsible for designing, implementing, and maintaining data solutions on the Azure cloud platform, and should have a strong background in data engineering, including ETL processes, data warehousing, and database management. The role requires expertise in Azure services such as Azure Data Factory, Azure SQL Database, Azure Databricks, and other relevant technologies. The Azure Data Engineer will collaborate with cross-functional teams to ensure the successful delivery of scalable and efficient data solutions.
A minimum of 3-4 years of experience on a cloud data integration platform (Azure preferred) is required. If you're passionate about building scalable, high-performance applications, thrive in a collaborative environment, and can join us on or before 15th Dec 2024, we'd love to hear from you.
Role: Azure Data Engineer
Experience: 3-6 years
Proven experience as a Data Engineer with a focus on Azure cloud technologies.
In-depth knowledge of Azure Data Factory, Azure SQL Database, Azure Databricks, and other relevant Azure services.
Strong proficiency in SQL, Python, and/or other scripting languages.
Experience with data modeling, schema design, and database performance tuning.
Familiarity with data warehousing concepts and best practices.
If you meet these qualifications and are excited about the opportunity to work with cutting-edge Azure technologies to solve complex data challenges, we encourage you to apply. Join our team and be a key contributor to the success of our data-driven initiatives.
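To illustrate the schema-design skills listed, a short sketch of explicit schema enforcement at ingestion time follows. The ADLS path, columns, and types are invented for the example.

```python
# Explicit schema enforcement at ingestion time (hypothetical ADLS path and columns).
# Declaring the schema avoids costly inference and catches drift early.
from pyspark.sql import SparkSession
from pyspark.sql.types import (
    DecimalType, StringType, StructField, StructType, TimestampType
)

spark = SparkSession.builder.getOrCreate()

schema = StructType([
    StructField("claim_id", StringType(), nullable=False),
    StructField("member_id", StringType(), nullable=False),
    StructField("claim_amount", DecimalType(18, 2), nullable=True),
    StructField("submitted_at", TimestampType(), nullable=True),
])

claims = (
    spark.read.schema(schema)
    .json("abfss://raw@account.dfs.core.windows.net/claims/")
)

# Rows failing the declared types surface as nulls here and can be quarantined.
claims.write.format("delta").mode("append").saveAsTable("bronze.claims")
```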
Posted 1 week ago
7.0 - 11.0 years
0 Lacs
Karnataka
On-site
As a Developer contracted by Luxoft to support customer initiatives, your main task will be developing solutions based on client requirements within the Telecom/network work environment. You will use technologies such as Databricks on Azure, Apache Spark, Python, SQL, and Apache Airflow to create and manage Databricks clusters for ETL processes. Integration with ADLS and Blob Storage, and efficient data ingestion from various sources including on-premises databases, cloud storage, APIs, and streaming data, will also be part of your role. You will handle secrets using Azure Key Vault, interact with APIs, and gain hands-on experience with Kafka/Azure Event Hubs streaming. Your expertise in Databricks Delta APIs, Unity Catalog, and version control tools like GitHub will be crucial. Additionally, you will be involved in data analytics, supporting ML frameworks, and integrating with Databricks for model training.
Proficiency in Python, Apache Airflow, Microsoft Azure, Databricks, SQL, ADLS, Blob Storage, Kafka/Azure Event Hubs, and various other related skills is a must. The ideal candidate should hold a Bachelor's degree in Computer Science or a related field and possess at least 7 years of development experience. Problem-solving skills, effective communication abilities, teamwork, and a commitment to continuous learning are essential traits for this role. Desirable skills include exposure to Snowflake, PostgreSQL, Redis, and GenAI, and a good understanding of RBAC.
Proficiency in English at C2 level is required for this senior-level position based in Bengaluru, India. This opportunity falls under the Big Data Development category within Cross Industry Solutions and is expected to be effective from 06/05/2025.
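For a feel of the Kafka/Azure Event Hubs streaming work described, here is a minimal Spark Structured Streaming sketch reading a Kafka-compatible source into a Delta table. The broker address, topic, and paths are placeholders; a real Event Hubs connection would additionally need SASL options not shown here.

```python
# Minimal Structured Streaming sketch: Kafka-compatible source -> Delta sink.
# Broker, topic, and paths are placeholders; production use needs auth options.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

stream = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "network-events")
    .option("startingOffsets", "latest")
    .load()
)

# Kafka payloads arrive as bytes; decode before writing downstream.
events = stream.select(
    F.col("key").cast("string"),
    F.col("value").cast("string").alias("payload"),
    "timestamp",
)

query = (
    events.writeStream.format("delta")
    .option("checkpointLocation", "/mnt/checkpoints/network-events")
    .toTable("bronze.network_events")
)
# query.awaitTermination()  # block until the stream is stopped
```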
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
Karnataka
On-site
Join the Consumer & Community Banking division at Chase, a leading U.S. financial services firm, as a skilled data professional in our Data & Analytics team. As an Analytical Solutions Manager within the Consumer and Community Banking (CCB) Finance Data & Insights Team, you will be part of an agile product team responsible for the development, production, and transformation of financial data and reporting across CCB. Your ability and passion to think beyond raw and disparate data will enable you to create data visualizations and intelligence solutions used by the organization's top leaders to achieve key strategic imperatives.
Responsibilities:
Lead conversations with business teams and create data visualizations and intelligence solutions used by top leaders to reach key strategic imperatives.
Identify and assess opportunities to eliminate manual processes, using automation tools such as Alteryx or Thought Spot to bring automated solutions to life.
Extract, analyze, and summarize data for ad hoc stakeholder requests, and play a significant role in transforming the data environment to a modernized cloud platform.
Transform raw data into actionable insights, demonstrating a history of learning and implementing new technologies.
Improve the lives of our people and increase value to the firm by leveraging the power of data and the best tools to analyze data, generate insights, save time, improve processes and controls, and lead the organization in developing skills of the future.
Qualifications:
Minimum 8 years of experience in SQL is a must.
Minimum 8 years of experience developing data visualizations and presentations.
Experience with data wrangling tools such as Alteryx.
Experience with relational databases, using SQL to pull and summarize large datasets, and with report creation and ad hoc analyses.
Knowledge of modern MPP databases and big-data (Hadoop) concepts.
Experience in reporting development and testing, and the ability to interpret unstructured data and draw objective inferences given known limitations of the data.
Demonstrated ability to think beyond raw data, understand the underlying business context, and sense business opportunities hidden in data.
Strong written and oral communication skills; ability to communicate effectively with all levels of management and partners from a variety of business functions.
Data architecture experience is needed.
Preferred qualifications, capabilities, and skills: experience with LLMs, Hive, Spark SQL, Impala, or other big-data query tools; Home Lending business understanding is a major advantage; experience with AWS, Databricks, Snowflake, or another cloud data warehouse; Thought Spot experience.
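As a nod to the SQL depth this posting asks for, here is a hedged example of the kind of summarization query involved, run as Spark SQL from Python with a window function for month-over-month change. The schema is invented.

```python
# Hypothetical month-over-month summarization with a window function (Spark SQL).
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

summary = spark.sql("""
    SELECT
        product_line,
        close_month,
        SUM(balance) AS total_balance,
        SUM(balance) - LAG(SUM(balance)) OVER (
            PARTITION BY product_line ORDER BY close_month
        ) AS mom_change
    FROM finance.month_end_balances
    GROUP BY product_line, close_month
    ORDER BY product_line, close_month
""")
summary.show()
```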
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Karnataka
On-site
Working at Atlassian, you have the flexibility to choose where you work - whether in an office, from home, or a mix of both. This empowers you to have more control over supporting your family, personal goals, and other priorities. Atlassian can hire people in any country where we have a legal entity. Our interviews and onboarding are conducted virtually, as we are a distributed-first company.
Joining the DevInfra Transformations team at Atlassian means being part of a group that enables teams to deliver quality work at a fast pace. The team's focus is on building an industry-leading internal platform, validating hypotheses with data, and shipping features for both internal and external usage. This unique opportunity allows you to work in a collaborative environment, implement cutting-edge machine learning techniques - especially forecasting modeling - and solve challenging problems.
As a Machine Learning Engineer - 2 at Atlassian, you will develop and implement advanced machine learning algorithms, train sophisticated models, and collaborate with engineering and analytics teams to integrate AI functionality into Atlassian products and platforms. Your role extends beyond these tasks to ensure that the transformative potential of AI is fully realized across the organization.
On your first day in this role, we expect you to have a Bachelor's or Master's degree, preferably in Computer Science, or equivalent experience. You should have at least 5 years of related industry experience in the data science domain; expertise in Python or Java and the ability to write high-quality production code; familiarity with SQL; knowledge of Spark and cloud data environments (such as AWS, Databricks); and experience building and scaling machine learning models. You should also be able to communicate complex data science concepts effectively, prioritize business practicality, and have an agile development mindset.
It would be advantageous if you have experience working in a consumer or B2C space for a SaaS product provider, or in the enterprise/B2B sector. Experience developing deep-learning-based models, working on LLM-related applications, and excelling at problem-solving are also considered beneficial.
At Atlassian, we offer a variety of perks and benefits to support you, your family, and your engagement with the local community, including health and wellbeing resources, paid volunteer days, and more. To find out more, visit go.atlassian.com/perksandbenefits.
Atlassian is driven by the common goal of unleashing the potential of every team. Our software products help teams worldwide collaborate effectively, making the impossible achievable through teamwork. To ensure the best experience, we are committed to providing accommodations or adjustments at any stage of the recruitment process; simply inform our Recruitment team during your conversations with them. For more information about our culture and hiring process, visit go.atlassian.com/crh.
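Given the posting's emphasis on forecasting modeling, here is a minimal lag-feature forecasting sketch with scikit-learn. The series is synthetic; real work would add seasonality handling and proper backtesting.

```python
# Tiny forecasting sketch: predict the next value of a series from lag features.
# The series is synthetic; real pipelines would add seasonality, holidays, backtesting.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(7)
series = np.cumsum(rng.normal(1.0, 0.5, 200))  # synthetic upward-drifting series

# Build a supervised dataset: predict y[t] from the previous 7 values.
lags = 7
X = np.array([series[i - lags:i] for i in range(lags, len(series))])
y = series[lags:]

model = GradientBoostingRegressor(random_state=0).fit(X[:-1], y[:-1])
print("next-step forecast:", model.predict(X[-1:])[0], "actual:", y[-1])
```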
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Haryana
On-site
As a DevOps/Data Platform Engineer at our company, you will play a crucial role in supporting our cloud-based infrastructure, CI/CD pipelines, and data platform operations. Your responsibilities will include building and managing CI/CD pipelines using tools like GitHub Actions, Jenkins, and Build Piper; provisioning infrastructure with Terraform, Ansible, and cloud services; and managing containers using Docker, Kubernetes, and registries. You will also support data platforms such as Snowflake and Databricks and their underlying data infrastructure, monitor systems using Grafana, Prometheus, and Datadog, secure environments using tools like AWS Secrets Manager, and enable MLOps pipelines, optimizing infrastructure for performance and cost.
To excel in this role, you should have strong experience in DevOps, cloud platforms, and Infrastructure as Code (IaC). Proficiency in Linux, automation, and system management is essential, along with familiarity with data platforms like Snowflake and Databricks and with CI/CD practices. Excellent troubleshooting skills, collaboration abilities, and effective communication are also required.
It would be advantageous if you have experience with ML model deployment, cost optimization, and infrastructure testing. Familiarity with data security best practices is considered a plus.
This is a full-time, permanent position located in Gurgaon. If you possess the required experience and skills and are capable of meeting the outlined responsibilities, we encourage you to apply.
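To make the monitoring responsibility concrete, here is a small sketch using the Python prometheus_client library to expose a pipeline metric for Prometheus to scrape (and Grafana to chart). The metric name and port are arbitrary choices.

```python
# Expose a simple pipeline-health metric for Prometheus to scrape.
# Metric name and port are arbitrary; Grafana would chart the scraped series.
import random
import time

from prometheus_client import Gauge, start_http_server

rows_processed = Gauge("etl_rows_processed", "Rows processed in the last batch")

start_http_server(8000)  # metrics served at http://localhost:8000/metrics

while True:
    rows_processed.set(random.randint(900, 1100))  # stand-in for a real batch count
    time.sleep(15)
```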
Posted 1 week ago