2.0 - 4.0 years
0 Lacs
Bengaluru East, Karnataka, India
On-site
Job Title: Data Analyst
Experience Required: 2 to 4 years
Location: Bangalore
Job Type: Full-time

We are looking for a detail-oriented and skilled Data Analyst to support our data-driven decision-making processes. In this role, you will be responsible for developing data pipelines, building dashboards, and delivering actionable insights using tools such as Databricks, Python, PL/SQL, and Tableau. You will also contribute to the development of basic machine learning models to support predictive and classification use cases. You will collaborate closely with business and technical teams to ensure data accuracy, accessibility, and integrity.

Key Responsibilities:

Data Analysis & Reporting
- Analyze structured and semi-structured data to uncover trends, insights, and opportunities.
- Design interactive dashboards and reports in Tableau for business stakeholders.
- Interpret data to identify key metrics, patterns, and anomalies.

Data Pipeline & ML Support
- Assist in building and maintaining ETL pipelines using Airflow, Databricks, and Python.
- Ensure timely and accurate data flow across systems.
- Contribute to the development and deployment of basic machine learning models for forecasting, segmentation, or classification tasks.

SQL & Data Handling
- Write and optimize PL/SQL queries for extracting and transforming data.
- Conduct data validation and ensure data consistency across reports and dashboards.

Collaboration & Communication
- Work closely with product, business, and engineering teams to understand data needs.
- Translate business questions into analytical tasks and deliver results with clear narratives.

Data Quality & Governance
- Support efforts to improve data accuracy, integrity, and governance.
- Monitor data pipelines and troubleshoot issues as needed.

Required Skills & Experience:
- 2–4 years of experience in data analysis or a similar role.
- Proficiency in SQL/PL-SQL, Python, and Tableau.
- Exposure to cloud-based data platforms such as Databricks, AWS, or GCP.
- Basic understanding of machine learning concepts and hands-on experience with simple ML models (e.g., regression, classification) in Python (see the sketch below).
- Strong analytical mindset with attention to detail.
- Ability to communicate insights clearly through visualizations and storytelling.
- Familiarity with data quality practices and pipeline monitoring.
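The requirements above call for hands-on experience with simple ML models (regression, classification) in Python. A minimal baseline-classifier sketch with scikit-learn; the input file and column names are hypothetical placeholders, not part of the posting.

```python
# Baseline classification sketch with scikit-learn (illustrative only).
# "churn.csv" and its column names are hypothetical placeholders.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import classification_report

df = pd.read_csv("churn.csv")              # hypothetical input file
X = df.drop(columns=["churned"])           # feature columns (assumed numeric)
y = df["churned"]                          # binary target

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y
)

model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)
print(classification_report(y_test, model.predict(X_test)))
```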
Posted 1 week ago
2.0 - 5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Position Summary
Data Engineer, DT US PxE

The Data Engineer is an integral part of the technical application development team and is primarily responsible for analyzing, planning, designing, developing, and implementing Azure data engineering solutions to meet the strategic, usability, performance, reliability, control, and security requirements of data science processes. Requires demonstrable knowledge in the areas of data engineering, AI/ML, data warehousing, and reporting applications. Must be innovative.

Work you will do
A unique opportunity to be a part of a growing team that works on a premier unified data science analytics platform within Deloitte. You will be responsible for implementing, delivering, and supporting data engineering and AI/ML solutions to support the Deloitte US Member Firm.

Outcome-Driven Accountability
- Collaborate with business and IT leaders to develop and refine ideas for integrating predictive and prescriptive analytics within business processes, ensuring measurable customer and business outcomes.
- Decompose complex business problems into manageable components, facilitating the use of multiple analytic modeling methods for holistic and valuable solutions.
- Develop and refine prototypes and proofs of concept, presenting results to business and IT leaders, and demonstrating the impact on customer needs and business outcomes.

Technical Leadership and Advocacy
- Engage in data analysis, generating and testing hypotheses, preparing and analyzing historical data, identifying patterns, and applying statistical methods to formulate solutions that deliver high-quality outcomes.
- Develop project plans, including resource needs and task dependencies, to meet project deliverables with a focus on incremental and iterative delivery.

Engineering Craftsmanship
- Participate in defining project scope, objectives, and quality controls for new projects, ensuring alignment with customer-centric engineering principles.
- Present and communicate project deliverable results, emphasizing the value delivered to customers and the business.

Customer-Centric Engineering
- Assist in recruiting and mentoring team members, fostering a culture of engineering craftsmanship and continuous learning.

Incremental and Iterative Delivery
- Stay abreast of changes in technology, leading new technology evaluations for predictive and statistical analytics, and advocating for innovative, lean, and feasible solutions.

Education: Bachelor’s degree in Computer Science or Business Information Systems, MCA, or an equivalent degree.

Qualifications:
- 2 to 5 years of advanced-level experience in Azure data engineering.
- Expertise in developing, deploying, and monitoring ADF pipelines (using Visual Studio and browsers).
- Expertise in Azure Databricks programming using PySpark, SparkR, and Spark SQL, or in Amazon EMR (Elastic MapReduce); a PySpark sketch follows below.
- Expertise in managing Azure storage and data services (Azure Data Lake Storage Gen2, Azure Blob Storage, Azure SQL Database, Azure Synapse Analytics, Azure Data Factory).
- Advanced programming skills in Python, R, and SQL (SQL for HANA, MS SQL).
- Hands-on experience with visualization tools (Tableau/Power BI).
- Hands-on experience with data science studios (Dataiku, Azure ML Studio, Amazon SageMaker).

The Team
Information Technology Services (ITS) helps power Deloitte’s success. ITS drives Deloitte, which serves many of the world’s largest, most respected organizations. We develop and deploy cutting-edge internal and go-to-market solutions that help Deloitte operate effectively and lead in the market.
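The qualifications center on Databricks notebook programming with PySpark over Azure storage. A minimal sketch of such a job, assuming a Databricks environment where the `spark` session is provided; the storage paths and column names are placeholders.

```python
# Minimal PySpark cleanup job of the kind run in an Azure Databricks notebook.
# The `spark` session is provided by Databricks; paths and columns are placeholders.
from pyspark.sql import functions as F

raw_path = "abfss://raw@examplestorage.dfs.core.windows.net/sales/"      # assumed ADLS Gen2 path
curated_path = "abfss://curated@examplestorage.dfs.core.windows.net/sales/"

df = spark.read.parquet(raw_path)

cleaned = (
    df.dropDuplicates(["order_id"])                       # assumed key column
      .withColumn("order_date", F.to_date("order_date"))  # normalize types
      .filter(F.col("amount") > 0)                        # drop invalid rows
)

cleaned.write.mode("overwrite").parquet(curated_path)
```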
Our reputation is built on a tradition of delivering with excellence. The ~3,000 professionals in ITS deliver services including: security, risk & compliance; technology support; infrastructure; applications; relationship management; strategy; deployment; PMO; financials; and communications.

Product Engineering (PxE)
Product Engineering (PxE) is the internal software and applications development team responsible for delivering leading-edge technologies to Deloitte professionals. Their broad portfolio includes web and mobile productivity tools that empower our people to log expenses, enter timesheets, book travel and more, anywhere, anytime. PxE enables our client service professionals through a comprehensive suite of applications across the business lines. In addition to application delivery, PxE offers full-scale design services, a robust mobile portfolio, cutting-edge analytics, and innovative custom development.

How you will grow
At Deloitte, we have invested a great deal to create a rich environment in which our professionals can grow. We want all our people to develop in their own way, playing to their own strengths as they hone their leadership skills. And, as a part of our efforts, we provide our professionals with a variety of learning and networking opportunities—including exposure to leaders, sponsors, coaches, and challenging assignments—to help accelerate their careers along the way. No two people learn in exactly the same way. So, we provide a range of resources, including live classrooms, team-based learning, and eLearning. Deloitte University (DU): The Leadership Center in India, our state-of-the-art, world-class learning center in the Hyderabad office, is an extension of the DU in Westlake, Texas, and represents a tangible symbol of our commitment to our people’s growth and development. Explore DU: The Leadership Center in India.

Benefits
At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you.

Deloitte’s culture
Our positive and supportive culture encourages our people to do their best work every day. We celebrate individuals by recognizing their uniqueness and offering them the flexibility to make daily choices that can help them to be healthy, centered, confident, and aware. We offer well-being programs and are continuously looking for new ways to maintain a culture that is inclusive, invites authenticity, leverages our diversity, and where our people excel and lead healthy, happy lives. Learn more about Life at Deloitte.

Corporate citizenship
Deloitte is led by a purpose: to make an impact that matters. This purpose defines who we are and extends to relationships with our clients, our people, and our communities. We believe that business has the power to inspire and transform. We focus on education, giving, skill-based volunteerism, and leadership to help drive positive social impact in our communities. Learn more about Deloitte’s impact on the world.

Disclaimer: Please note that this description is subject to change basis business/engagement requirements and at the discretion of the management.

Our purpose
Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.
Our people and culture
Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.

Professional development
At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India.

Benefits To Help You Thrive
At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially—and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment and/or other criteria. Learn more about what working at Deloitte can mean for you.

Recruiting tips
From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.

Requisition code: 302720
Posted 1 week ago
10.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Requisition Number: 101627
Architect II
Location: The role will be a hybrid position located in Delhi NCR, Hyderabad, Pune, Trivandrum, or Bangalore, India.

Insight at a Glance:
- 14,000+ engaged teammates globally
- #20 on Fortune’s World's Best Workplaces™ list
- $9.2 billion in revenue
- Received 35+ industry and partner awards in the past year
- $1.4M+ total charitable contributions in 2023 by Insight globally

Now is the time to bring your expertise to Insight. We are not just a tech company; we are a people-first company. We believe that by unlocking the power of people and technology, we can accelerate transformation and achieve extraordinary results. As a Fortune 500 Solutions Integrator with deep expertise in cloud, data, AI, cybersecurity, and intelligent edge, we guide organisations through complex digital decisions.

About the role
The Architect II (Data) will focus on leading our Business Intelligence (BI) and Data Warehousing (DW) initiatives. This role involves designing and implementing end-to-end data pipelines using cloud services and data frameworks. They will collaborate with stakeholders and ETL/BI developers in an agile environment to create scalable, secure data architectures, ensuring alignment with business requirements, industry best practices, and regulatory compliance.

Responsibilities
- Architect and implement end-to-end data pipelines, data lakes, and warehouses using modern cloud services and architectural patterns.
- Develop and build analytics tools that deliver actionable insights to the business.
- Integrate and manage large, complex data sets to meet strategic business requirements.
- Optimize data processing workflows using frameworks such as PySpark (see the sketch below).
- Establish and enforce best practices for data quality, integrity, security, and performance across the entire data ecosystem.
- Collaborate with cross-functional teams to prioritize deliverables and design solutions.
- Develop compelling business cases and return on investment (ROI) analyses to support strategic initiatives.
- Drive process improvements for enhanced data delivery speed and reliability.
- Provide technical leadership, training, and mentorship to team members, promoting a culture of excellence.

Qualifications
- 10+ years in Business Intelligence (BI) solution design, with 8+ years specializing in ETL processes and data warehouse architecture.
- 8+ years of hands-on experience with Azure data services, including Azure Data Factory, Azure Databricks, Azure Data Lake Gen2, Azure SQL DB, Synapse, Power BI, and MS Fabric (knowledge).
- Strong Python and PySpark software engineering proficiency, coupled with a proven track record of building and optimizing big data pipelines, architectures, and datasets.
- Proficient in transforming, processing, and extracting insights from vast, disparate datasets, and in building robust data pipelines for metadata, dependency, and workload management.
- Familiarity with software development lifecycles/methodologies, particularly Agile.
- Experience with SAP/ERP/Datasphere data modeling is a significant plus.
- Excellent presentation and collaboration skills, capable of creating formal documentation and supporting cross-functional teams in a dynamic environment.
- Strong problem-solving, time management, and organizational abilities.
- Keen to learn new languages and technologies continually.
- Graduate degree in Computer Science, Statistics, Informatics, Information Systems, or an equivalent field.
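One responsibility above is optimizing data processing workflows with PySpark. A small sketch of two routine optimizations, a broadcast join for a small dimension table and a partitioned output, under assumed paths and column names.

```python
# Illustrative PySpark optimizations: broadcast a small dimension table,
# then write the result partitioned by a common filter column.
# All paths and column names are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("pipeline-optimization").getOrCreate()

facts = spark.read.parquet("/data/facts/orders")    # large fact table (assumed path)
dims = spark.read.parquet("/data/dims/customers")   # small dimension table

# Broadcasting the small side avoids shuffling the large fact table.
enriched = facts.join(F.broadcast(dims), on="customer_id", how="left")

# Partitioning output by a frequent query predicate speeds downstream reads.
(enriched
    .repartition("order_date")
    .write.mode("overwrite")
    .partitionBy("order_date")
    .parquet("/data/curated/orders_enriched"))
```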
Benefits
What you can expect: We’re legendary for taking care of you and your family, and for helping you engage with your local community. We want you to enjoy a full, meaningful life and own your career at Insight. Some of our benefits include:
- Freedom to work from another location—even an international destination—for up to 30 consecutive calendar days per year.

But what really sets us apart are our core values of Hunger, Heart, and Harmony, which guide everything we do, from building relationships with teammates, partners, and clients to making a positive impact in our communities. Join us today; your ambITious journey starts here.

Insight is an equal opportunity employer, and all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability status, protected veteran status, sexual orientation or any other characteristic protected by law. When you apply, please tell us the pronouns you use and any reasonable adjustments you may need during the interview process. At Insight, we celebrate diversity of skills and experience, so even if you don’t feel like your skills are a perfect match, we still want to hear from you!

Insight India Location: Level 16, Tower B, Building No 14, DLF Cyber City IT/ITES SEZ, Sector 24 & 25A, Gurugram (Gurgaon), Haryana 122002, India
Posted 1 week ago
10.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Job Title: Solutions Lead – Commercial & Specialty Insurance (BPM & Digital Transformation)
Location: India (Hybrid)
Department: Insurance IGM (EMEA)
Reports To: Solutions and Capability Leader, Insurance
Grade: D1/D2

Job Summary:
We are looking for a strategic and technically adept Solutions Lead to drive the design and delivery of winning BPM solutions for RFPs and proposals, with a strong focus on the insurance industry. The ideal candidate will combine deep domain knowledge with expertise in digital technologies, AI, and data-driven transformation to craft innovative, scalable, and client-centric solutions.

Key Responsibilities:
- Lead the solutioning and technical response for BPM-related RFPs, RFIs, and proposals in the insurance sector (preferably UK Commercial and Specialty Insurance).
- Collaborate with sales, delivery, product, and technology teams to design end-to-end solutions that align with client needs and strategic goals.
- Translate complex business requirements into digital-first BPM solutions leveraging automation, AI/ML, analytics, and cloud platforms.
- Develop solution blueprints, architecture diagrams, and value propositions tailored to insurance clients.
- Present solutions to internal stakeholders and clients through orals, workshops, and demos.
- Stay current with industry trends, regulatory changes, and emerging technologies in insurance and BPM.
- Support cost modeling, risk assessment, and transition planning for proposed solutions.
- Contribute to the development of reusable assets, accelerators, and best practices.

Required Qualifications:
- Bachelor’s degree in Engineering, Computer Science, or a related field (Master’s or MBA preferred).
- 10+ years of experience in BPM, digital transformation, or solution architecture roles.
- Proven experience in insurance domain processes, systems, and regulatory environments.
- Strong understanding of BPM platforms (e.g., Appian), AI/ML, RPA, and data analytics.
- Experience in crafting responses for large, complex RFPs and client engagements.
- Excellent communication, presentation, and stakeholder management skills.
- Familiarity with cloud ecosystems (AWS, Azure, GCP) and data platforms (Snowflake, Databricks, etc.).

Preferred Skills:
- Experience with digital underwriting, claims automation, policy administration, or customer experience transformation in UK Commercial and Specialty Insurance.
- Prior experience working within the Lloyd’s and London Market landscape.
- Certifications in BPM tools, cloud platforms, or AI/ML frameworks are a plus.
- Experience working with global delivery teams and offshore models.
- Familiarity with regulatory and compliance requirements in the insurance industry.
- Strong analytical and storytelling skills to craft compelling value propositions.

Why Join Us?
- Be at the forefront of digital innovation in the insurance industry.
- Work with a collaborative, high-impact team on strategic deals.
- Access to cutting-edge tools, platforms, and learning opportunities.
- Competitive compensation, benefits, and career growth.
Posted 1 week ago
4.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Why Join 7-Eleven Global Solution Center?
When you join us, you'll embrace ownership as teams within specific product areas take responsibility for end-to-end solution delivery, supporting local teams and integrating new digital assets. Challenge yourself by contributing to products deployed across our extensive network of convenience stores, processing over a billion transactions annually. Build solutions for scale, addressing the diverse needs of our 84,000+ stores in 19 countries. Experience growth through cross-functional learning, encouraged and applauded at 7-Eleven GSC. With our size, stability, and resources, you can navigate a rewarding career. Embody leadership and service as 7-Eleven GSC remains dedicated to meeting the needs of customers and communities.

Why We Exist, Our Purpose and Our Transformation?
7-Eleven is dedicated to being a customer-centric, digitally empowered organization that seamlessly integrates our physical stores with digital offerings. Our goal is to redefine convenience by consistently providing top-notch customer experiences and solutions in a rapidly evolving consumer landscape. Anticipating customer preferences, we create and implement platforms that empower customers to shop, pay, and access products and services according to their preferences. To achieve success, we are driving a cultural shift anchored in leadership principles, supported by the realignment of organizational resources and processes.

At 7-Eleven we are guided by our Leadership Principles. Each principle has a defined set of behaviours which help guide the 7-Eleven GSC team to Serve Customers and Support Stores.
- Be Customer Obsessed
- Be Courageous with Your Point of View
- Challenge the Status Quo
- Act Like an Entrepreneur
- Have an “It Can Be Done” Attitude
- Do the Right Thing
- Be Accountable

About This Opportunity
We are seeking a highly skilled senior AI/ML engineer to design, implement, and deploy AI/ML solutions that drive innovation and efficiency. The ideal candidate will have extensive experience with LangChain, NLP, RAG-based systems, prompt engineering, agentic systems, and cloud platforms (Azure, AWS), and will be adept at building AI-driven applications.

Responsibilities:
- Design and implement AI-driven solutions using advanced frameworks and technologies, ensuring scalability and efficiency.
- Develop and optimize LangChain (agents, chains, memories, parsers, document loaders), GenAI, and NLP models for specific use cases.
- Quickly experiment with different machine learning models for specific use cases.
- Demonstrate strong problem-solving capabilities, quickly propose feasible solutions, and effectively communicate strategy and risk-mitigation approaches to leadership.

Educational & Required Qualifications:
- Overall 4–6 years of experience in software development, particularly on AI/ML-based projects.
- Must have experience with Azure cloud and Databricks setup.
- Proficiency in Python and machine learning frameworks such as TensorFlow, PyTorch, and scikit-learn.
- Strong understanding of machine learning algorithms, including supervised and unsupervised learning, reinforcement learning, and deep learning.
- Strong expertise in Generative AI, NLP, and conversational AI technologies.
- Experience in building and deploying AI-powered applications at scale.
- Strong understanding of ML pipelines, feature engineering, and MLOps.
- Experience with LLM + RAG and building agentic frameworks (see the RAG sketch below).
- Excellent written and verbal communication skills.
- Experience leading and mentoring team members to help them grow to their full potential.
- Ability to understand business requirements and translate them into technical requirements.
- Educational background: Bachelor’s or master’s degree in Computer Science, Artificial Intelligence, Machine Learning, or a related field.
- Familiarity with code versioning tools such as Git (GitLab).
- Exposure to the retail industry and experience with e-commerce applications.

7-Eleven Global Solution Center is an Equal Opportunity Employer committed to diversity in the workplace. Our strategy focuses on three core pillars – workplace culture, diverse talent and how we show up in the communities we serve. As the recognized leader in convenience, the 7-Eleven family of brands embraces diversity, equity and inclusion (DE+I). It’s not only the right thing to do for customers, Franchisees and employees—it’s a business imperative.
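The role centers on RAG-based systems. A framework-agnostic sketch of the core retrieval step (embed, rank by cosine similarity, assemble a grounded prompt); `embed()` is a stand-in for a real embedding model, and the documents and query are invented for illustration.

```python
# Minimal retrieval-augmented generation (RAG) core, framework-agnostic.
# embed() is a placeholder for a real embedding model (e.g., an Azure OpenAI
# embeddings call); the documents and query are illustrative.
import numpy as np

def embed(text: str) -> np.ndarray:
    # Placeholder: toy pseudo-embedding derived from the text's hash.
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    v = rng.standard_normal(128)
    return v / np.linalg.norm(v)

docs = [
    "Store returns must be processed within 30 days.",
    "Loyalty points expire after 12 months of inactivity.",
    "Fuel discounts apply only to registered members.",
]
doc_vecs = np.stack([embed(d) for d in docs])

query = "When do loyalty points expire?"
scores = doc_vecs @ embed(query)          # cosine similarity (unit-norm vectors)
top = [docs[i] for i in np.argsort(scores)[::-1][:2]]

# The retrieved context is then spliced into the LLM prompt.
prompt = "Answer using only this context:\n" + "\n".join(top) + f"\n\nQ: {query}"
print(prompt)
```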
Posted 1 week ago
3.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Why Join 7-Eleven Global Solution Center?
When you join us, you'll embrace ownership as teams within specific product areas take responsibility for end-to-end solution delivery, supporting local teams and integrating new digital assets. Challenge yourself by contributing to products deployed across our extensive network of convenience stores, processing over a billion transactions annually. Build solutions for scale, addressing the diverse needs of our 84,000+ stores in 19 countries. Experience growth through cross-functional learning, encouraged and applauded at 7-Eleven GSC. With our size, stability, and resources, you can navigate a rewarding career. Embody leadership and service as 7-Eleven GSC remains dedicated to meeting the needs of customers and communities.

Why We Exist, Our Purpose and Our Transformation?
7-Eleven is dedicated to being a customer-centric, digitally empowered organization that seamlessly integrates our physical stores with digital offerings. Our goal is to redefine convenience by consistently providing top-notch customer experiences and solutions in a rapidly evolving consumer landscape. Anticipating customer preferences, we create and implement platforms that empower customers to shop, pay, and access products and services according to their preferences. To achieve success, we are driving a cultural shift anchored in leadership principles, supported by the realignment of organizational resources and processes.

At 7-Eleven we are guided by our Leadership Principles. Each principle has a defined set of behaviours which help guide the 7-Eleven GSC team to Serve Customers and Support Stores.
- Be Customer Obsessed
- Be Courageous with Your Point of View
- Challenge the Status Quo
- Act Like an Entrepreneur
- Have an “It Can Be Done” Attitude
- Do the Right Thing
- Be Accountable

Responsibilities:
- 3–5 years of combined experience on Azure data engineering projects.
- Proficient with Spark topics: PySpark, Databricks, Synapse Notebooks.
- Proficient with Azure data services (Azure Synapse, Azure Data Factory, the Azure security model, Azure Data Lake, etc.).
- Strong SQL DW knowledge.
- Must have 1+ years of development experience with the Spark framework.
- Exposure to streaming technologies and message buses such as Kafka/RabbitMQ (see the streaming sketch below).
- Proficient in creating pipelines and activities using both Azure and on-prem data stores for full and incremental data loads into a cloud DW.
- Proficient in data topics: management, processing, cleaning, transforming, aggregating, security, and modeling using AAS/PBI.
- Working knowledge of creating Power BI reports and dashboards.
- Must have strong programming skills in at least one programming language, such as Python or Java.
- Strong SQL and data analysis skills.
- Experience working within a distributed team, following an Agile methodology with a DevOps approach using Git or TFS.
- Bachelor’s degree in computer science or a related field; a master’s degree is considered a plus.

7-Eleven Global Solution Center is an Equal Opportunity Employer committed to diversity in the workplace. Our strategy focuses on three core pillars – workplace culture, diverse talent and how we show up in the communities we serve. As the recognized leader in convenience, the 7-Eleven family of brands embraces diversity, equity and inclusion (DE+I). It’s not only the right thing to do for customers, Franchisees and employees—it’s a business imperative.
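The responsibilities pair the Spark framework with streaming buses such as Kafka. A minimal Spark Structured Streaming sketch, assuming a Databricks-provided `spark` session with the Kafka connector available; the broker, topic, and sink paths are placeholders.

```python
# Minimal Spark Structured Streaming read from Kafka into a Delta sink.
# Broker, topic, and paths are placeholders; `spark` is assumed provided
# (e.g., by a Databricks cluster with the Kafka connector available).
from pyspark.sql import functions as F

stream = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")   # assumed broker
    .option("subscribe", "store-events")                # assumed topic
    .option("startingOffsets", "latest")
    .load()
)

events = stream.select(
    F.col("key").cast("string"),
    F.col("value").cast("string").alias("payload"),
    "timestamp",
)

query = (
    events.writeStream.format("delta")
    .option("checkpointLocation", "/chk/store-events")  # placeholder path
    .outputMode("append")
    .start("/bronze/store-events")                      # placeholder sink
)
```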
Privileges & Perquisites:
7-Eleven Global Solution Center offers a comprehensive benefits plan tailored to meet the needs and improve the overall experience of our employees, aiding in the management of both their professional and personal aspects.
- Work-Life Balance: Encouraging employees to unwind, recharge, and find balance, we offer flexible and hybrid work schedules along with diverse leave options. Supplementary allowances and compensatory days off are provided for specific work demands.
- Well-Being & Family Protection: Comprehensive medical coverage for spouses, children, and parents/in-laws, with voluntary top-up plans, OPD coverage, day care services, and access to health coaches. Additionally, an Employee Assistance Program with free, unbiased and confidential expert consultations for personal and professional issues.
- Wheels and Meals: Free transportation and cafeteria facilities with diverse menu options including breakfast, lunch, snacks, and beverages, with customizable and health-conscious choices.
- Certification & Training Program: Sponsored training for specialized certifications. Investment in employee development through labs and learning platforms.
- Hassle-free Relocation: Support and reimbursement for newly hired employees relocating to Bangalore, India.
Posted 1 week ago
6.0 - 11.0 years
20 - 35 Lacs
Hyderabad
Remote
Databricks Administrator | Azure/AWS | Remote | 6+ Years

Job Description:
We are looking for an experienced Databricks Administrator to manage and optimize our Databricks environment on AWS. You will be responsible for setting up and maintaining workspaces, clusters, access control, and integrations, while ensuring security, performance, and governance.

Key Responsibilities:
- Databricks Administration: Manage Databricks workspaces, clusters, and jobs across AWS (see the API sketch below).
- User & Access Management: Control user roles, permissions, and workspace-level security.
- Unity Catalog & Data Governance: Set up and manage Unity Catalog; implement data governance policies.
- Security & Network Configuration: Configure encryption, authentication, VPCs, private links, and networking on AWS.
- Integration & Automation: Integrate with cloud services and BI tools, and automate processes using Python, Terraform, and Git.
- Monitoring & CI/CD: Implement monitoring (CloudWatch, Prometheus, etc.) and manage CI/CD pipelines using GitLab, Jenkins, or similar.
- Collaboration: Work closely with data engineers, analysts, and DevOps teams to support data workflows.

Must-Have Skills:
- Strong experience with Databricks on AWS
- Unity Catalog setup and governance best practices
- AWS network/security configuration (VPC, IAM, KMS)
- Experience with CI/CD tools (Git, Jenkins, etc.)
- Terraform and Infrastructure as Code (IaC)
- Scripting knowledge in Python or Shell

Email: Hrushikesh.akkala@numerictech.com
Phone/WhatsApp: 9700111702
For immediate response and further opportunities, connect with me on LinkedIn: https://www.linkedin.com/in/hrushikesh-a-74a32126a/
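Administration tasks like the cluster management above are commonly scripted against the Databricks REST API. A sketch using the 2.0 Clusters API with a personal access token; the workspace URL and environment variable are placeholders, and error handling is minimal.

```python
# List Databricks clusters and report any that are not running or starting.
# Workspace URL and token are placeholders; requires the `requests` package.
import os
import requests

HOST = "https://dbc-example.cloud.databricks.com"   # placeholder workspace URL
TOKEN = os.environ["DATABRICKS_TOKEN"]              # personal access token

resp = requests.get(
    f"{HOST}/api/2.0/clusters/list",
    headers={"Authorization": f"Bearer {TOKEN}"},
    timeout=30,
)
resp.raise_for_status()

for cluster in resp.json().get("clusters", []):
    state = cluster.get("state")
    if state not in ("RUNNING", "PENDING"):
        print(f"{cluster['cluster_name']}: {state}")
```

In practice, Terraform would own the declarative workspace configuration the posting mentions, while small scripts like this handle ad-hoc operational checks.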
Posted 1 week ago
0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Responsibilities
- Build and maintain secure, scalable data pipelines using Databricks and Azure
- Handle ingestion from diverse sources (files, APIs, streaming), data transformation, and quality validation
- Collaborate with subsystem data science and product teams for ML readiness

Requirements
Skills & Experience
- Technical: Notebooks (SQL, Python), Delta Lake, Unity Catalog, ADLS/S3, job orchestration, APIs, structured logging, IaC (Terraform); a Delta merge sketch follows below
- Delivery: Trunk-based development, TDD, Git, CI/CD for notebooks and pipelines
- Integration: Familiar with JSON, CSV, XML, Parquet, SQL/NoSQL/graph databases
- Communication: Able to justify decisions, document architecture, and align with enabling teams

Benefits
What you get:
- Best-in-class salary: We hire only the best, and we pay accordingly
- Proximity Talks: Meet other designers, engineers, and product geeks — and learn from experts in the field
- Keep on learning with a world-class team: Work with the best in the field, challenge yourself constantly, and learn something new every day

This is a contract role based in Abu Dhabi. If relocation from India is required, the company will cover travel and accommodation expenses in addition to your salary.

About Us
Proximity is the trusted technology, design, and consulting partner for some of the biggest Sports, Media and Entertainment companies in the world! We're headquartered in San Francisco and have offices in Palo Alto, Dubai, Mumbai, and Bangalore. Since 2019, Proximity has created and grown high-impact, scalable products used by 370 million daily users, with a total net worth of $45.7 billion among our client companies.

We are Proximity — a global team of coders, designers, product managers, geeks, and experts. We solve complex problems and build cutting edge tech, at scale. Our team of Proxonauts is growing quickly, which means your impact on the company's success will be huge. You'll have the chance to work with experienced leaders who have built and led multiple tech, product and design teams.

Here's a quick guide to getting to know us better:
- Watch our CEO, Hardik Jagda, tell you all about Proximity
- Read about Proximity's values and meet some of our Proxonauts here
- Explore our website, blog, and the design wing — Studio Proximity
- Get behind-the-scenes with us on Instagram! Follow @ProxWrks and @H.Jagda
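The technical stack above leans on Delta Lake. A sketch of an idempotent upsert using the delta-spark `DeltaTable` merge API, assuming a Spark session with Delta configured (as on Databricks); the paths and join key are placeholders.

```python
# Idempotent upsert into a Delta table using the delta-spark MERGE API.
# Paths and column names are placeholders; assumes a `spark` session with
# Delta Lake configured (as provided on Databricks).
from delta.tables import DeltaTable

updates = spark.read.json("/landing/users/")          # assumed incoming batch

target = DeltaTable.forPath(spark, "/silver/users")   # assumed Delta table path

(target.alias("t")
    .merge(updates.alias("u"), "t.user_id = u.user_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute())
```

Re-running the same batch leaves the table unchanged, which is what makes the merge pattern preferable to blind appends in ingestion pipelines.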
Posted 1 week ago
3.0 - 8.0 years
5 - 10 Lacs
Bengaluru
Work from Office
Job Title: Azure Data Engineer

Job Summary:
We are looking for a Data Engineer with hands-on experience in the Azure ecosystem. You will be responsible for designing, building, and maintaining both batch and real-time data pipelines using Azure cloud services.

Key Responsibilities:
- Develop and maintain data pipelines using Azure Synapse Analytics, Data Factory, and Databricks
- Work with real-time streaming tools like Azure Event Hub, Azure Stream Analytics, and Apache Kafka (an Event Hubs sketch follows below)
- Design and manage data storage using ADLS Gen2, Blob Storage, Cosmos DB, and SQL Data Warehouse
- Use Spark (Python/Scala) for data processing in Databricks
- Implement data workflows with tools like Apache Airflow and dbt
- Automate processes using Azure Functions and Python
- Ensure data quality, performance, and security

Required Skills:
- Strong knowledge of the Azure data platform (Synapse, ADLS Gen2, Data Factory, Event Hub, Cosmos DB)
- Experience with Spark (in Databricks) and Python or Scala
- Familiarity with tools like Azure Purview, dbt, and Airflow
- Good understanding of real-time and batch processing architectures
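Real-time ingestion from Azure Event Hubs, listed above, typically starts with the azure-eventhub SDK. A minimal consumer sketch; the connection-string environment variable and hub name are placeholders.

```python
# Minimal Azure Event Hubs consumer (azure-eventhub v5 SDK).
# Connection string and hub name are placeholders.
import os
from azure.eventhub import EventHubConsumerClient

def on_event(partition_context, event):
    # Process one event; a real pipeline would parse and persist the payload.
    print(partition_context.partition_id, event.body_as_str())

client = EventHubConsumerClient.from_connection_string(
    os.environ["EVENTHUB_CONN_STR"],      # assumed environment variable
    consumer_group="$Default",
    eventhub_name="telemetry",            # placeholder hub name
)

with client:
    client.receive(on_event=on_event, starting_position="-1")  # from beginning
```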
Posted 1 week ago
2.0 - 4.0 years
0 Lacs
Bengaluru, Karnataka, India
Remote
Data Engineer I/II - IN (Operations/Support)
Work Timings: 24x7 (IST)
Work Location: Remote
Experience: 2-4 years

Job Description Summary
The Data Engineer is responsible for managing and operating Tableau, Tableau Bridge server, Databricks, dbt, SQL, SSRS, SSIS, AWS DWS, AWS AppFlow, and Power BI. The engineer will work closely with the customer and team lead to manage and operate the cloud data platform.

JOB COMPLEXITY: This role requires extensive problem-solving skills and the ability to research an issue, determine the root cause, and implement the resolution; research of various sources, such as Databricks/AWS/Tableau documentation, may be required to identify and resolve issues. Must have the ability to prioritize issues and multi-task.

SUPERVISION: Works under moderate supervision.

EXPERIENCE/EDUCATION: Requires a Bachelor’s degree in computer science or another related field, plus 2-4 years of hands-on experience in configuring and managing Tableau/Databricks and SQL-based data analytics solutions. Experience with a Tableau/Databricks and SQL data warehouse environment is desired.

PHYSICAL DEMANDS: General office environment. No special physical demands required. Schedule flexibility to include working a weekend day regularly and holidays as required by the business for 24/7 operations. Occasional travel, less than 10%.

POLICY COMPLIANCE: Responsible for adhering to company security policies and procedures and any other relevant policies and standards.

Knowledge/Skills
- Good hands-on experience with Tableau, Tableau Bridge server, Databricks, SSRS/SSIS, AWS DWS, AWS AppFlow, and Power BI.
- Ability to read and write SQL and stored procedures (see the sketch below).
- Experience on AWS.
- Good hands-on experience in configuring, managing, and troubleshooting, along with general analytical and problem-solving skills.
- Excellent written and verbal communication skills. Ability to communicate technical info and ideas so others will understand.
- Ability to successfully work and promote inclusiveness in small groups.

Job Responsibilities
- Troubleshooting incidents/problems: collect logs, cross-check against known issues, and investigate common root causes (for example, failed batches, or infra-related items such as connectivity to source and network issues).
- Knowledge Management: Create/update runbooks and entitlements as needed.
- Governance: Watch all configuration changes to batches and infrastructure (cloud platform), mapping them to proper documentation and aligning resources.
- Communication: Lead and act as a POC for the customer from off-site, handling communication, escalation, and issue isolation, and coordinating with off-site resources while level-setting expectations across stakeholders.
- Change Management: Align resources for on-demand changes and coordinate with stakeholders as required.
- Request Management: Handle user requests; if a request is not runbook-based, create a new KB article or update the runbook accordingly.
- Incident Management and Problem Management: Perform root cause analysis and come up with preventive measures and recommendations, such as enhanced monitoring or systematic changes, as needed.

About Rackspace Technology
We are the multicloud solutions experts. We combine our expertise with the world’s leading technologies — across applications, data and security — to deliver end-to-end solutions. We have a proven record of advising customers based on their business challenges, designing solutions that scale, building and managing those solutions, and optimizing returns into the future.
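The skills list includes reading and writing SQL and stored procedures in an operations context. A sketch of calling a procedure and flagging failed batch loads via pyodbc; the server, credentials, procedure name, and result columns are all hypothetical.

```python
# Call a stored procedure and flag failed batch loads (pyodbc).
# Connection string, procedure name, and result columns are placeholders.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=example-dw.database.windows.net;DATABASE=dw;"   # placeholder server
    "UID=ops_user;PWD=<secret>;Encrypt=yes"
)
cursor = conn.cursor()

# Hypothetical procedure returning one row per batch with a status column.
cursor.execute("{CALL dbo.usp_get_batch_status (?)}", ("2024-01-01",))
for batch_id, status, message in cursor.fetchall():
    if status != "SUCCESS":
        print(f"batch {batch_id}: {status} - {message}")

conn.close()
```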
Named a best place to work, year after year according to Fortune, Forbes and Glassdoor, we attract and develop world-class talent. Join us on our mission to embrace technology, empower customers and deliver the future. More on Rackspace Technology Though we’re all different, Rackers thrive through our connection to a central goal: to be a valued member of a winning team on an inspiring mission. We bring our whole selves to work every day. And we embrace the notion that unique perspectives fuel innovation and enable us to best serve our customers and communities around the globe. We welcome you to apply today and want you to know that we are committed to offering equal employment opportunity without regard to age, color, disability, gender reassignment or identity or expression, genetic information, marital or civil partner status, pregnancy or maternity status, military or veteran status, nationality, ethnic or national origin, race, religion or belief, sexual orientation, or any legally protected characteristic. If you have a disability or special need that requires accommodation, please let us know.
Posted 1 week ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Business Unit: Cubic Transportation Systems

Company Details:
When you join Cubic, you become part of a company that creates and delivers technology solutions in transportation to make people’s lives easier by simplifying their daily journeys, and defense capabilities to help promote mission success and safety for those who serve their nation. Led by our talented teams around the world, Cubic is committed to solving global issues through innovation and service to our customers and partners. We have a top-tier portfolio of businesses, including Cubic Transportation Systems (CTS) and Cubic Defense (CD). Explore more on Cubic.com.

Job Details:
Job Summary:
We are seeking a hands-on and highly skilled Principal Data Analyst to join our dynamic DMAP team. The ideal candidate will have extensive experience in SQL, Power BI, and data modeling, and a strong understanding of analytics and ETL processes. This role requires a proactive individual with strong analytical thinking, an appetite for learning emerging technologies, and a commitment to delivery excellence.

Key Responsibilities:
- Design, develop, and optimize Power BI reports and dashboards.
- Write and tune complex SQL queries and joins for data extraction and analysis.
- Develop advanced DAX measures and optimize performance.
- Implement Row-Level Security (RLS) and create scalable data models (star/snowflake schemas).
- Perform data transformations and integrate data from various sources.
- Collaborate with engineering and business teams to translate requirements into reporting solutions.
- Support Power BI Service administration and deployment best practices.
- Contribute to cloud-based data solutions using Azure services.
- Support change requests and provide production-level analytics support.
- Maintain data governance standards, including GDPR and PII handling.
- Work closely on a daily basis with the backend engineering team to ensure seamless data integration and alignment of deliverables.

Required Skills and Qualifications:
- Minimum 5+ years of hands-on experience in:
  - SQL (advanced concepts, joins, regular querying)
  - Power BI (report development, DAX, RLS, performance tuning)
  - Data modeling (tabular, star, snowflake)
  - Analytics and ETL concepts
- Proficiency in Power BI Fabric, data connectors, and transformation techniques.
- Strong understanding of Azure components: Active Directory, Azure SQL, Azure Data Factory.
- Experience with data governance practices (GDPR, PII).

Preferred Skills (Good to Have):
- Familiarity with Power BI license types and Service admin tasks.
- Knowledge of Delta tables, Databricks, Synapse, data lakes, and warehouses.
- Exposure to Qlik Replicate, Oracle GoldenGate (OGG).
- Working knowledge of Power Automate, Logic Apps.
- Basic understanding of Python.
- Experience with other BI tools like Tableau, Google Data Studio.
- Microsoft Certified: Power BI Data Analyst Associate.
- Awareness of data engineering and advanced analytics concepts.

Soft Skills & Expectations:
- Strong logical and analytical problem-solving abilities.
- High levels of initiative and proactiveness.
- Good communication skills – must be able to express ideas clearly and confidently in front of stakeholders.
- Willingness to learn new technologies and adapt to changing requirements.
- A committed and dependable work ethic; should not exhibit complacency.

Worker Type: Employee
Posted 1 week ago
8.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Role: Data Scientist
Experience: 8+ years
Location: Ahmedabad/Hyderabad

We are seeking a Data Scientist to develop and implement advanced predictive models and optimization solutions. The ideal candidate will have expertise in predictive modeling, integer programming, Python development, and cloud-based data processing. This role will involve working with large datasets, solving complex optimization problems, and managing cloud infrastructure for scalable solutions.

Key Responsibilities:
- Develop and implement models using statistical methods (e.g., Bayesian models).
- Solve optimization problems such as bin packing, TSP, and clustering using integer programming (see the bin-packing sketch below).
- Develop Python-based solutions using Git/Poetry for code and library management.
- Work with data processing libraries such as Pandas, Polars, and others.
- Deploy and manage data pipelines on Databricks and Azure Blob Storage.
- Monitor and troubleshoot pipelines, logs, and cloud resources.
- Implement DevOps best practices (nice to have) for automation and CI/CD workflows.
- Utilize Power Apps/Power Automate for workflow automation and business process improvement.
- Ensure cloud cost optimization and performance tuning for scalable architectures.

Required Skills & Qualifications:
- Strong experience in predictive modeling and statistical techniques (Bayesian modeling preferred).
- Hands-on experience with integer programming and clustering methods.
- Proficiency in Python, including experience with Git/Poetry for code and dependency management.
- Expertise in data processing libraries such as Pandas, Polars, or equivalent.
- Familiarity with Azure cloud services, Databricks, and Azure Blob Storage.
- Ability to read and analyze logs for debugging and performance monitoring.
- Experience with cloud management and optimizing resources.
- Knowledge of monitoring pipelines and troubleshooting issues.
- Strong problem-solving skills and ability to work with large-scale datasets.

Preferred Qualifications:
- Exposure to DevOps practices, including CI/CD pipelines and automation.
- Familiarity with Power Apps/Power Automate for process automation.
- Strong background in cloud cost management and performance tuning.
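Bin packing via integer programming, listed in the responsibilities, is usually posed as a small MILP. A sketch using PuLP (one solver choice among several; an assumption, since the posting names no library): y_j marks bin j as used and x_ij assigns item i to bin j.

```python
# Bin packing as a small MILP with PuLP: minimize bins used, subject to
# capacity. Item sizes and capacity are illustrative.
from pulp import LpProblem, LpVariable, LpMinimize, LpBinary, lpSum, PULP_CBC_CMD

sizes = [4, 8, 1, 4, 2, 1]   # item sizes (illustrative)
cap = 10                     # bin capacity
n = len(sizes)               # at most n bins are ever needed

prob = LpProblem("bin_packing", LpMinimize)
y = [LpVariable(f"y_{j}", cat=LpBinary) for j in range(n)]          # bin j used
x = [[LpVariable(f"x_{i}_{j}", cat=LpBinary) for j in range(n)]     # item i in bin j
     for i in range(n)]

prob += lpSum(y)  # objective: number of bins opened
for i in range(n):
    prob += lpSum(x[i][j] for j in range(n)) == 1                      # each item placed once
for j in range(n):
    prob += lpSum(sizes[i] * x[i][j] for i in range(n)) <= cap * y[j]  # capacity

prob.solve(PULP_CBC_CMD(msg=False))
print("bins used:", int(sum(v.value() for v in y)))
```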
Posted 1 week ago
2.0 - 4.0 years
0 Lacs
India
Remote
Data Engineer I/II - IN (Operations/Support)
Work Timings: 24x7 (IST)
Work Location: Remote
Experience: 2-4 years

Job Description Summary
The Data Engineer is responsible for managing and operating Tableau, Tableau Bridge server, Databricks, dbt, SQL, SSRS, SSIS, AWS DWS, AWS AppFlow, and Power BI. The engineer will work closely with the customer and team lead to manage and operate the cloud data platform.

JOB COMPLEXITY: This role requires extensive problem-solving skills and the ability to research an issue, determine the root cause, and implement the resolution; research of various sources, such as Databricks/AWS/Tableau documentation, may be required to identify and resolve issues. Must have the ability to prioritize issues and multi-task.

SUPERVISION: Works under moderate supervision.

EXPERIENCE/EDUCATION: Requires a Bachelor’s degree in computer science or another related field, plus 2-4 years of hands-on experience in configuring and managing Tableau/Databricks and SQL-based data analytics solutions. Experience with a Tableau/Databricks and SQL data warehouse environment is desired.

PHYSICAL DEMANDS: General office environment. No special physical demands required. Schedule flexibility to include working a weekend day regularly and holidays as required by the business for 24/7 operations. Occasional travel, less than 10%.

POLICY COMPLIANCE: Responsible for adhering to company security policies and procedures and any other relevant policies and standards.

Knowledge/Skills
- Good hands-on experience with Tableau, Tableau Bridge server, Databricks, SSRS/SSIS, AWS DWS, AWS AppFlow, and Power BI.
- Ability to read and write SQL and stored procedures.
- Experience on AWS.
- Good hands-on experience in configuring, managing, and troubleshooting, along with general analytical and problem-solving skills.
- Excellent written and verbal communication skills. Ability to communicate technical info and ideas so others will understand.
- Ability to successfully work and promote inclusiveness in small groups.

Job Responsibilities
- Troubleshooting incidents/problems: collect logs, cross-check against known issues, and investigate common root causes (for example, failed batches, or infra-related items such as connectivity to source and network issues).
- Knowledge Management: Create/update runbooks and entitlements as needed.
- Governance: Watch all configuration changes to batches and infrastructure (cloud platform), mapping them to proper documentation and aligning resources.
- Communication: Lead and act as a POC for the customer from off-site, handling communication, escalation, and issue isolation, and coordinating with off-site resources while level-setting expectations across stakeholders.
- Change Management: Align resources for on-demand changes and coordinate with stakeholders as required.
- Request Management: Handle user requests; if a request is not runbook-based, create a new KB article or update the runbook accordingly.
- Incident Management and Problem Management: Perform root cause analysis and come up with preventive measures and recommendations, such as enhanced monitoring or systematic changes, as needed.

About Rackspace Technology
We are the multicloud solutions experts. We combine our expertise with the world’s leading technologies — across applications, data and security — to deliver end-to-end solutions. We have a proven record of advising customers based on their business challenges, designing solutions that scale, building and managing those solutions, and optimizing returns into the future.
Named a best place to work, year after year according to Fortune, Forbes and Glassdoor, we attract and develop world-class talent. Join us on our mission to embrace technology, empower customers and deliver the future. More on Rackspace Technology Though we’re all different, Rackers thrive through our connection to a central goal: to be a valued member of a winning team on an inspiring mission. We bring our whole selves to work every day. And we embrace the notion that unique perspectives fuel innovation and enable us to best serve our customers and communities around the globe. We welcome you to apply today and want you to know that we are committed to offering equal employment opportunity without regard to age, color, disability, gender reassignment or identity or expression, genetic information, marital or civil partner status, pregnancy or maternity status, military or veteran status, nationality, ethnic or national origin, race, religion or belief, sexual orientation, or any legally protected characteristic. If you have a disability or special need that requires accommodation, please let us know.
Posted 1 week ago
5.0 - 7.0 years
7 - 9 Lacs
Mumbai, Delhi / NCR, Bengaluru
Work from Office
Work Mode: Remote
Contract Duration: 6 Months to 1 Year
Location: Mumbai, Delhi/NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune, Remote (open to candidates across India)

Job Overview:
We are seeking a highly skilled Technical Data Analyst for a remote contract position (6 to 12 months) to help build a single source of truth for our high-volume direct-to-consumer accounting and financial data warehouse. You will work closely with Finance & Accounting teams and play a pivotal role in dashboard creation, data transformation, and migration from Snowflake to Databricks.

Key Responsibilities:
1. Data Analysis & Reporting
- Develop month-end accounting and tax dashboards using SQL in Snowflake (Snowsight)
- Migrate and transition reports/dashboards to Databricks (see the query sketch below)
- Gather, analyze, and transform business requirements from finance/accounting stakeholders into data products
2. Data Transformation & Aggregation
- Build transformation pipelines in Databricks to support balance-sheet look-forward views
- Maintain data accuracy and consistency throughout the Snowflake-to-Databricks migration
- Partner with Data Engineering to optimize pipeline performance
3. ERP & Data Integration
- Support integration of financial data with NetSuite ERP
- Validate transformed data to ensure correct ingestion and mapping into ERP systems
4. Ingestion & Data Ops
- Work with Fivetran for ingestion and resolve any pipeline or data accuracy issues
- Monitor data workflows and collaborate with engineering teams on troubleshooting

Required Skills & Qualifications:
- 5+ years of experience as a Data Analyst (preferably in the Finance/Accounting domain)
- Strong in SQL, with proven experience in Snowflake and Databricks
- Experience in building financial dashboards (month-end close, tax reporting, balance sheets)
- Understanding of financial/accounting data: GL, journal entries, balance sheets, income statements
- Familiarity with Fivetran or similar data ingestion tools
- Experience with data transformation in a cloud environment
- Strong communication and stakeholder management skills
- Nice to have: Experience working with NetSuite ERP

Apply Now: Please share your updated resume with the following details:
- Full Name
- Total Experience
- Relevant Experience in SQL, Snowflake, Databricks
- Experience in Finance or Accounting domain
- Current Location
- Availability (Notice Period)
- Current and Expected Rate
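Migration work like this often begins by re-pointing month-end queries at Databricks SQL. A sketch using the databricks-sql-connector package; the host, HTTP path, and the table and columns in the query are placeholders.

```python
# Run a month-end close aggregation against Databricks SQL
# (databricks-sql-connector). Connection details and schema are placeholders.
import os
from databricks import sql

with sql.connect(
    server_hostname="dbc-example.cloud.databricks.com",  # placeholder host
    http_path="/sql/1.0/warehouses/abc123",              # placeholder path
    access_token=os.environ["DATABRICKS_TOKEN"],
) as conn:
    with conn.cursor() as cursor:
        cursor.execute("""
            SELECT account, SUM(amount) AS closing_balance
            FROM finance.gl_journal_entries        -- hypothetical table
            WHERE period = '2024-01'
            GROUP BY account
            ORDER BY account
        """)
        for account, balance in cursor.fetchall():
            print(account, balance)
```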
Posted 1 week ago
0 years
0 Lacs
Mumbai Metropolitan Region
On-site
We are hiring a Data Engineer to design and manage data pipelines from factory floors to the Azure cloud, supporting our central data lakehouse architecture. You'll work closely with OT engineers, architects, and AI teams to move data from edge devices into curated layers (Bronze → Silver → Gold), ensuring high data quality, security, and performance. Your work will directly enable advanced analytics and AI in production and operations.

Key Job Functions
1) Build data ingestion and transformation pipelines using Azure Data Factory, IoT Hub, and Databricks
2) Integrate OT sensor data using protocols like OPC-UA and MQTT (see the MQTT sketch below)
3) Design Medallion architecture flows with Delta Lake and Synapse
4) Monitor and optimize data performance and reliability
5) Implement data quality, observability, and lineage practices (e.g., with Purview or Unity Catalog)
6) Collaborate with OT and IT teams to ensure contextualized, usable data
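OT ingestion over MQTT (item 2 above) typically starts with a small subscriber like this paho-mqtt sketch (1.x-style callback API); the broker, topic tree, and payload format are placeholders.

```python
# Minimal MQTT subscriber for OT sensor data (paho-mqtt 1.x-style API).
# Broker, topic tree, and payload format are placeholders.
import json
import paho.mqtt.client as mqtt

def on_connect(client, userdata, flags, rc):
    client.subscribe("factory/line1/sensors/#")   # placeholder topic tree

def on_message(client, userdata, msg):
    reading = json.loads(msg.payload)             # assumes JSON payloads
    # In the real pipeline this would land in the Bronze layer (e.g., via
    # IoT Hub or a file drop) rather than print to stdout.
    print(msg.topic, reading)

client = mqtt.Client()
client.on_connect = on_connect
client.on_message = on_message
client.connect("broker.factory.local", 1883)      # placeholder broker
client.loop_forever()
```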
Posted 1 week ago
15.0 years
0 Lacs
Thane, Maharashtra, India
On-site
Key Responsibilities:

Platform Stabilization & Operational Excellence
- Accountable for stable, reliable, and secure operations across all data warehouse applications, ensuring adherence to defined SLAs and KPIs.
- Assess the current data platform architecture, identify bottlenecks, and implement solutions to ensure high availability, reliability, performance, and scalability.
- Establish robust monitoring, alerting, and incident management processes for all data pipelines and infrastructure.
- Drive initiatives to improve data quality, consistency, and trustworthiness across the platform.
- Oversee the operational health and day-to-day management of existing data systems during the transition period.
- Manage relationships with strategic vendors across the enterprise applications landscape, ensuring strong performance, innovation contributions, and commercial value.

Platform Modernization & Architecture
- Define and execute a strategic roadmap for modernizing PerkinElmer's data platform, leveraging cloud-native technologies (AWS, Azure, or GCP) and modern data stack components (e.g., data lakes/lakehouses, Data Fabric/Mesh architectures, streaming platforms like Kafka/Kinesis, orchestration tools like Airflow, ELT/ETL tools, containerization).
- Lead the design and implementation of a scalable, resilient, and cost-effective data architecture that meets current and future business needs (DaaS).
- Champion and implement DataOps principles, including CI/CD, automated testing, and infrastructure-as-code, to improve development velocity and reliability.
- Stay abreast of emerging technologies and industry trends, evaluating and recommending new tools and techniques to enhance the platform.

Leadership & Strategy
- Build, mentor, and lead a world-class data engineering team, fostering a culture of innovation, collaboration, and continuous improvement.
- Develop and manage the data engineering budget, resources, and vendor relationships.
- Define the overall data engineering vision, strategy, and multi-year roadmap in alignment with PerkinElmer's business objectives.
- Effectively communicate strategy, progress, and challenges to executive leadership and key stakeholders across the organization.
- Drive cross-functional collaboration with IT, Security, Enterprise Apps, R&D, and Business Units.

Data Monetization Enablement
- Partner closely with business leaders, enterprise app teams, and other business teams to understand data needs and identify opportunities for data monetization.
- Architect data solutions, APIs, and data products that enable the creation of new revenue streams or significant internal efficiencies derived from data assets.
- Ensure robust data governance, security, and privacy controls are embedded within the platform design and data products, adhering to relevant regulations (e.g., GDPR, HIPAA where applicable).
- Build the foundational data infrastructure required to support advanced analytics, machine learning, and AI initiatives.

Basic Qualifications

Required Qualifications & Experience
- Bachelor's or Master's degree in Computer Science, Engineering, Information Technology, or a related quantitative field.
- 15+ years of experience in data engineering, data architecture, and/or data warehousing.
- 5+ years of experience in a leadership role, managing data engineering teams and driving large-scale data initiatives.
- Proven track record of successfully leading the stabilization, modernization, and scaling of complex data platforms.
Deep expertise in modern data architecture patterns (Data Lakes, Data Warehouses, Lakehouses, Lambda/Kappa architectures). Extensive hands-on experience with cloud data platforms (AWS, Azure, or GCP – specify preferred if applicable) and their associated data services (e.g., S3/ADLS/GCS, Redshift/Synapse/BigQuery, EMR/Dataproc/Databricks, Kinesis/Kafka/Event Hubs, Glue/Data Factory/Dataflow). Strong experience with big data technologies (e.g., Spark, Hadoop ecosystem) and data processing frameworks. Proficiency with data pipeline orchestration tools (e.g., Airflow, Prefect, Dagster). Solid understanding of SQL and NoSQL databases, data modeling techniques, and ETL/ELT development. Experience with programming languages commonly used in data engineering (e.g., Python, Scala, Java). Excellent understanding of data governance, data security, and data privacy principles and best practices. Exceptional leadership, communication, stakeholder management, and strategic thinking skills. Demonstrated ability to translate business requirements into technical solutions.
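As a small illustration of the orchestration layer this role owns (Airflow is named above), a minimal daily ELT DAG; the DAG id, schedule, and task bodies are placeholders, not PerkinElmer specifics.

```python
# Minimal Airflow DAG sketch for a daily ELT run; task bodies are placeholders.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull increments from source systems")

def load():
    print("load increments into the lakehouse")

with DAG(
    dag_id="daily_elt",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task
```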
Posted 1 week ago
3.0 years
0 Lacs
Greater Nashik Area
On-site
Dreaming big is in our DNA. It’s who we are as a company. It’s our culture. It’s our heritage. And more than ever, it’s our future. A future where we’re always looking forward. Always serving up new ways to meet life’s moments. A future where we keep dreaming bigger. We look for people with passion, talent, and curiosity, and provide them with the teammates, resources and opportunities to unleash their full potential. The power we create together – when we combine your strengths with ours – is unstoppable. Are you ready to join a team that dreams as big as you do?
AB InBev GCC was incorporated in 2014 as a strategic partner for Anheuser-Busch InBev. The center leverages the power of data and analytics to drive growth for critical business functions such as operations, finance, people, and technology. The teams are transforming Operations through Tech and Analytics.
Do You Dream Big? We Need You.
Job Title: Data Scientist
Location: Bangalore
Reporting to: Manager - Analytics / Senior Manager - Analytics
PURPOSE OF THE ROLE
Contributing to the Data Science efforts of AB InBev's global non-commercial analytics capability, Supply Analytics. The candidate will be required to contribute hands-on, and may also need to guide the DS team staffed on the area and assess the effort required to scale and standardize the use of Data Science across multiple ABI markets.
KEY TASKS AND ACCOUNTABILITIES
Understand the business problem and translate it into an analytical problem; participate in the solution design process.
Manage the full AI/ML lifecycle, including data preprocessing, feature engineering, model training, validation, deployment, and monitoring.
Develop reusable and modular Python code adhering to OOP (Object-Oriented Programming) principles.
Design, develop, and deploy machine learning models into production environments on Azure.
Collaborate with data scientists, software engineers, and other stakeholders to meet business needs.
Communicate findings clearly to both technical and business stakeholders.
Qualifications, Experience, Skills
Level of educational attainment required (1 or more of the following): B.Tech/BE/Masters in CS/IS/AI/ML.
Previous work experience required: Minimum 3 years of relevant experience.
Technical Skills Required
Must Have
Strong expertise in Python, including advanced knowledge of OOP concepts.
Exposure to AI/ML methodologies, with previous hands-on experience in ML concepts like forecasting, clustering, regression, classification, and optimization using Python.
Azure tech stack, Databricks, and MLflow on any cloud platform.
Airflow for orchestrating and automating workflows.
MLOps concepts and containerization tools like Docker.
Experience with version control tools such as Git.
Consistent intent for problem solving.
Strong communication skills (vocal and written); ability to effectively communicate and present information at various levels of an organization.
Good To Have
Preferred industry exposure in the Manufacturing domain.
Product-building experience.
Other Skills Required
Passion for solving problems using data. Detail-oriented, analytical, and inquisitive. Ability to learn on the go. Ability to work independently and with others.
We dream big to create a future with more cheers!
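To make the MLflow expectation above concrete, a minimal experiment-tracking sketch with synthetic data; the model choice, parameters, and run name are illustrative, not part of the role description, and the API shown is MLflow 2.x.

```python
# Minimal MLflow tracking sketch (MLflow 2.x API) with a scikit-learn model.
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, random_state=42)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=42)

with mlflow.start_run(run_name="rf_baseline"):
    model = RandomForestClassifier(n_estimators=100, random_state=42)
    model.fit(X_tr, y_tr)
    acc = accuracy_score(y_te, model.predict(X_te))
    mlflow.log_param("n_estimators", 100)   # record hyperparameters
    mlflow.log_metric("accuracy", acc)      # record evaluation metrics
    mlflow.sklearn.log_model(model, artifact_path="model")  # store the model
```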
Posted 1 week ago
2.0 - 4.0 years
0 Lacs
Gurgaon, Haryana, India
Remote
Data Engineer - I/II - IN (Operations/Support)
Work Timings: 24x7 (IST)
Work Location: Remote
Experience: 2-4 years
Job Description Summary
The Data Engineer is responsible for managing and operating Tableau, Tableau Bridge server, Databricks, dbt, SQL, SSRS, SSIS, AWS DWS, AWS AppFlow, and Power BI. The engineer will work closely with the customer and team lead to manage and operate the cloud data platform.
JOB COMPLEXITY: This role requires extensive problem-solving skills and the ability to research an issue, determine the root cause, and implement the resolution; research of various sources such as Databricks/AWS/Tableau documentation may be required to identify and resolve issues. Must have the ability to prioritize issues and multi-task.
SUPERVISION: Works under moderate supervision.
EXPERIENCE/EDUCATION: Requires a Bachelor's degree in computer science or another related field plus 2-4 years of hands-on experience in configuring and managing Tableau/Databricks and SQL-based data analytics solutions. Experience with a Tableau/Databricks and SQL data warehouse environment is desired.
PHYSICAL DEMANDS: General office environment. No special physical demands required. Schedule flexibility to include working a weekend day regularly and holidays as required by the business for 24/7 operations. Occasional travel, less than 10%.
POLICY COMPLIANCE: Responsible for adhering to company security policies and procedures and any other relevant policies and standards.
Knowledge/Skills
Good hands-on experience with Tableau, Tableau Bridge server, Databricks, SSRS/SSIS, AWS DWS, AWS AppFlow, and Power BI. Ability to read and write SQL and stored procedures. Experience on AWS. Good hands-on experience in configuring, managing, and troubleshooting, along with general analytical and problem-solving skills. Excellent written and verbal communication skills. Ability to communicate technical information and ideas so others will understand. Ability to successfully work and promote inclusiveness in small groups.
Job Responsibilities
Troubleshooting incidents/problems, including collecting logs, cross-checking against known issues, and investigating common root causes (for example, failed batches, infrastructure-related items such as connectivity to source, network issues, etc.)
Knowledge Management: Create/update runbooks and entitlements as needed
Governance: Watch all configuration changes to batches and infrastructure (cloud platform), mapping them to proper documentation and aligning resources
Communication: Lead and act as a POC for the customer from off-site, handling communication and escalation, isolating issues, and coordinating off-site resources while level-setting expectations across stakeholders
Change Management: Align resources for on-demand changes and coordinate with stakeholders as required
Request Management: Handle user requests; if a request is not runbook-based, create a new KB article or update the runbook accordingly
Incident Management and Problem Management: Root cause analysis, preventive measures, and recommendations such as enhanced monitoring or systematic changes as needed
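As a sketch of the first-line triage this role performs on failed batches, the snippet below polls the Databricks Jobs 2.1 REST API for recently failed runs; the workspace URL and token are placeholders.

```python
# Triage sketch: list recently failed Databricks job runs (Jobs 2.1 REST API).
import requests

HOST = "https://adb-12345.azuredatabricks.net"  # workspace URL (placeholder)
TOKEN = "dapi..."                                # personal access token (placeholder)

resp = requests.get(
    f"{HOST}/api/2.1/jobs/runs/list",
    headers={"Authorization": f"Bearer {TOKEN}"},
    params={"completed_only": "true", "limit": 25},
    timeout=30,
)
resp.raise_for_status()

# Print id, name, and failure message for each failed run as a starting point.
for run in resp.json().get("runs", []):
    state = run.get("state", {})
    if state.get("result_state") == "FAILED":
        print(run["run_id"], run.get("run_name"), state.get("state_message"))
```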
Named a best place to work, year after year according to Fortune, Forbes and Glassdoor, we attract and develop world-class talent. Join us on our mission to embrace technology, empower customers and deliver the future. More on Rackspace Technology Though we’re all different, Rackers thrive through our connection to a central goal: to be a valued member of a winning team on an inspiring mission. We bring our whole selves to work every day. And we embrace the notion that unique perspectives fuel innovation and enable us to best serve our customers and communities around the globe. We welcome you to apply today and want you to know that we are committed to offering equal employment opportunity without regard to age, color, disability, gender reassignment or identity or expression, genetic information, marital or civil partner status, pregnancy or maternity status, military or veteran status, nationality, ethnic or national origin, race, religion or belief, sexual orientation, or any legally protected characteristic. If you have a disability or special need that requires accommodation, please let us know. Apply for this job
Posted 1 week ago
0 years
0 Lacs
Sadar, Uttar Pradesh, India
On-site
Summary: We are seeking a talented and motivated AI Engineer to join our team and focus on building cutting-edge Generative AI applications. The ideal candidate will possess a strong background in data science, machine learning, and deep learning, with specific experience in developing and fine-tuning Large Language Models (LLMs) and Small Language Models (SLMs). You should be comfortable managing the full lifecycle of AI projects, from initial design and data handling to deployment and production monitoring. A foundational understanding of software engineering principles is also required to collaborate effectively with engineering teams and ensure robust deployments.
Responsibilities:
Design, develop, and implement Generative AI solutions, including applications leveraging Retrieval-Augmented Generation (RAG) techniques.
Fine-tune existing Large Language Models (LLMs) and potentially develop smaller, specialized language models (SLMs) for specific tasks.
Manage the end-to-end lifecycle of AI model development, including data curation, feature extraction, model training, validation, deployment, and monitoring.
Research and experiment with state-of-the-art AI/ML/DL techniques to enhance model performance and capabilities.
Build and maintain scalable production pipelines for AI models.
Collaborate with data engineering and IT teams to define deployment roadmaps and integrate AI solutions into existing systems.
Develop AI-powered tools to solve business problems, such as summarization, chatbots, recommendation systems, or code assistance.
Stay updated with the latest advancements in Generative AI, machine learning, and deep learning.
Qualifications:
Proven experience as a Data Scientist, Machine Learning Engineer, or AI Engineer with a focus on LLMs and Generative AI.
Strong experience with Generative AI techniques and frameworks (e.g., RAG, fine-tuning, LangChain, LlamaIndex, PEFT, LoRA).
Solid foundation in machine learning (e.g., regression, classification, clustering, XGBoost, SVM) and deep learning (e.g., ANN, LSTM, RNN, CNN) concepts and applications.
Proficiency in Python and relevant libraries (e.g., Pandas, NumPy, scikit-learn, TensorFlow/PyTorch).
Experience with data science principles, including statistics, hypothesis testing, and A/B testing.
Experience deploying and managing models in production environments (e.g., using platforms like AWS, Databricks, MLflow).
Familiarity with data handling and processing tools (e.g., SQL, Spark/PySpark).
Basic understanding of software engineering practices, including version control (Git) and containerization (Docker).
Bachelor's or Master's degree in Computer Science, Artificial Intelligence, Data Science, or a related quantitative field.
Preferred Skills:
Experience building RAG-based chatbots or similar applications.
Experience developing custom SLMs.
Experience with MLOps principles and tools (e.g., MLflow, Airflow).
Experience migrating ML workflows between cloud platforms.
Familiarity with vector databases and indexing techniques.
Experience with Python web frameworks (e.g., Django, Flask).
Experience building and integrating APIs (e.g., RESTful APIs).
Basic experience with front-end development or UI building for showcasing AI applications.
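To ground the RAG responsibility above, a toy retrieval sketch: score a small corpus against a question and assemble a context-grounded prompt. The corpus, the TF-IDF retriever, and the final LLM call are stand-ins for a production vector store and model.

```python
# Toy RAG sketch: retrieve relevant passages, then build a grounded prompt.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

corpus = [
    "Our refund policy allows returns within 30 days of purchase.",
    "Shipping within India typically takes 5-7 business days.",
    "Premium support is available 24x7 for enterprise customers.",
]

vectorizer = TfidfVectorizer().fit(corpus)
doc_matrix = vectorizer.transform(corpus)

def retrieve(question, k=2):
    """Return the k passages most similar to the question."""
    scores = cosine_similarity(vectorizer.transform([question]), doc_matrix)[0]
    return [corpus[i] for i in scores.argsort()[::-1][:k]]

question = "How long do I have to return an item?"
context = "\n".join(retrieve(question))
prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
print(prompt)  # in production, this prompt goes to the fine-tuned LLM/SLM
```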
Posted 1 week ago
4.0 - 6.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Responsible for developing, optimizing, and maintaining business intelligence and data warehouse systems; ensuring secure, efficient data storage and retrieval; enabling self-service data exploration; and supporting stakeholders with insightful reporting and analysis.
Grade: T5
Please note that the job will close at 12am on the posting close date, so please submit your application prior to the close date.
Accountabilities
What your main responsibilities are:
Data Pipeline - Develop and maintain scalable data pipelines and build out new API integrations to support continuing increases in data volume and complexity.
Data Integration - Connect offline and online data to continuously improve overall understanding of customer behavior and journeys for personalization. Data pre-processing, including collecting, parsing, managing, analyzing, and visualizing large sets of data.
Data Quality Management - Cleanse data and improve data quality and readiness for analysis. Drive standards, define and implement/improve data governance strategies, and enforce best practices to scale data analysis across platforms.
Data Transformation - Process data by cleansing it and transforming it into the proper storage structure for querying and analysis using ETL and ELT processes.
Data Enablement - Ensure data is accessible and usable to the wider enterprise to enable a deeper and more timely understanding of operations.
Qualifications & Specifications
Master's/Bachelor's degree in Engineering, Computer Science, Math, Statistics, or equivalent. Strong programming skills in Python/PySpark/SAS. Proven experience with large data sets and related technologies: Hadoop, Hive, distributed computing systems, Spark optimization. Experience on cloud platforms (preferably Azure) and their services: Azure Data Factory (ADF), ADLS Storage, Azure DevOps. Hands-on experience with Databricks, Delta Lake, and Workflows. Knowledge of DevOps processes and tools like Docker, CI/CD, Kubernetes, Terraform, Octopus. Hands-on experience with SQL and data modeling to support the organization's data storage and analysis needs. Experience with a BI tool like Power BI (good to have). Cloud migration experience (good to have). Cloud and data engineering certification (good to have). Experience working in an Agile environment.
4-6 years of relevant work experience is required. Experience with stakeholder management is an added advantage.
What We Are Looking For
Education: Bachelor's degree or equivalent in Computer Science, MIS, Mathematics, Statistics, or a similar discipline. Master's degree or PhD preferred.
Knowledge, Skills and Abilities
Fluency in English
Analytical Skills
Accuracy & Attention to Detail
Numerical Skills
Planning & Organizing Skills
Presentation Skills
Data Modeling and Database Design
ETL (Extract, Transform, Load) Skills
Programming Skills
FedEx was built on a philosophy that puts people first, one we take seriously. We are an equal opportunity/affirmative action employer and we are committed to a diverse, equitable, and inclusive workforce in which we enforce fair treatment, and provide growth opportunities for everyone. All qualified applicants will receive consideration for employment regardless of age, race, color, national origin, genetics, religion, gender, marital status, pregnancy (including childbirth or a related medical condition), physical or mental disability, or any other characteristic protected by applicable laws, regulations, and ordinances.
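As an illustration of the cleansing-and-transformation accountability above, a minimal PySpark step, assuming a Delta-enabled Spark environment (e.g., Databricks on Azure); the storage paths and event schema are invented for the example.

```python
# PySpark cleansing/transformation sketch for customer-journey events.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("customer_journey_prep").getOrCreate()

raw = spark.read.json("abfss://landing@account.dfs.core.windows.net/events/")

clean = (raw
         .dropDuplicates(["event_id"])                        # de-duplicate events
         .filter(F.col("customer_id").isNotNull())            # drop orphan rows
         .withColumn("event_ts", F.to_timestamp("event_time"))  # type the timestamp
         .withColumn("channel", F.lower(F.trim("channel"))))  # standardize labels

(clean.write.format("delta").mode("append")
 .partitionBy("channel")
 .save("abfss://curated@account.dfs.core.windows.net/events/"))
```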
Our Company FedEx is one of the world's largest express transportation companies and has consistently been selected as one of the top 10 World’s Most Admired Companies by "Fortune" magazine. Every day FedEx delivers for its customers with transportation and business solutions, serving more than 220 countries and territories around the globe. We can serve this global network due to our outstanding team of FedEx team members, who are tasked with making every FedEx experience outstanding. Our Philosophy The People-Service-Profit philosophy (P-S-P) describes the principles that govern every FedEx decision, policy, or activity. FedEx takes care of our people; they, in turn, deliver the impeccable service demanded by our customers, who reward us with the profitability necessary to secure our future. The essential element in making the People-Service-Profit philosophy such a positive force for the company is where we close the circle, and return these profits back into the business, and invest back in our people. Our success in the industry is attributed to our people. Through our P-S-P philosophy, we have a work environment that encourages team members to be innovative in delivering the highest possible quality of service to our customers. We care for their well-being, and value their contributions to the company. Our Culture Our culture is important for many reasons, and we intentionally bring it to life through our behaviors, actions, and activities in every part of the world. The FedEx culture and values have been a cornerstone of our success and growth since we began in the early 1970’s. While other companies can copy our systems, infrastructure, and processes, our culture makes us unique and is often a differentiating factor as we compete and grow in today’s global marketplace.
Posted 1 week ago
0 years
0 Lacs
Gurugram, Haryana, India
On-site
Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Senior Associate
Job Description & Summary
At PwC, our people in data and analytics focus on leveraging data to drive insights and make informed business decisions. They utilise advanced analytics techniques to help clients optimise their operations and achieve their strategic goals. In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage.
Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.
At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.
Responsibilities:
· Design, develop, and maintain scalable data pipelines using Azure data services such as Azure Data Factory and Apache Spark.
· Implement efficient Extract, Transform, Load (ETL) processes to move and transform data across various sources.
· Design, develop, and maintain data solutions using Azure Synapse Analytics.
· Implement data ingestion, transformation, and extraction processes using Azure Synapse Pipelines.
· Knowledge of data warehousing concepts.
· Utilize Azure SQL Database, Azure Blob Storage, Azure Data Lake Storage, and other Azure data services to store and retrieve data.
· Performance optimization and troubleshooting capabilities.
· Advanced SQL knowledge, capable of writing optimized queries for faster data workflows.
· Proven work experience in Spark, Python, SQL, and any RDBMS.
· Experience in designing solutions for multiple large data warehouses with a good understanding of cluster and parallel architecture as well as high-scale or distributed RDBMS.
· Must be extremely well versed in handling large-volume data and working with different tools to derive the required solution.
Mandatory skill sets: Azure Databricks, Azure Data Factory (ADF), or Azure Synapse Analytics, along with Python and SQL expertise.
Preferred skill sets:
· Experience with Delta Lake, Power BI, or Azure DevOps.
· Knowledge of Databricks will be a plus.
· Knowledge of Spark, Scala, or other distributed processing frameworks.
· Exposure to BI tools like Power BI, Tableau, or Looker.
· Familiarity with data security and compliance in the cloud.
· Experience in leading a development team.
Years of experience required: 4-7 yrs
Education qualification: B.Tech/MBA/MCA
Degrees/Field of Study required: Master of Business Administration, Bachelor of Technology
Required Skills: Azure Synapse Analytics, Databricks Platform
Optional Skills: Accepting Feedback, Active Listening, Analytical Thinking, Business Case Development, Business Data Analytics, Business Intelligence and Reporting Tools (BIRT), Business Intelligence Development Studio, Communication, Competitive Advantage, Continuous Process Improvement, Creativity, Data Analysis and Interpretation, Data Architecture, Database Management System (DBMS), Data Collection, Data Pipeline, Data Quality, Data Science, Data Visualization, Embracing Change, Emotional Regulation, Empathy, Inclusion, Industry Trend Analysis {+ 16 more}
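Given the mandatory Databricks and Delta Lake skills above, a sketch of a pattern they commonly imply: an incremental, watermark-based upsert into a Delta table. Table and column names are illustrative, and this assumes the delta-spark package or a Databricks runtime.

```python
# Incremental MERGE (upsert) into a Delta table using a watermark column.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Highest timestamp already loaded into the curated table.
last_loaded = spark.sql(
    "SELECT COALESCE(MAX(updated_at), TIMESTAMP'1900-01-01') AS wm "
    "FROM curated.customers"
).first()["wm"]

# Only rows newer than the watermark need to be processed.
increments = (spark.read.table("staging.customers")
              .filter(F.col("updated_at") > F.lit(last_loaded)))

(DeltaTable.forName(spark, "curated.customers").alias("t")
 .merge(increments.alias("s"), "t.customer_id = s.customer_id")
 .whenMatchedUpdateAll()      # update existing customers
 .whenNotMatchedInsertAll()   # insert new customers
 .execute())
```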
Posted 1 week ago
11.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
As a Principal Software Engineer for Data, you will lead the design and implementation of scalable, secure, and high-performance data pipelines that involve healthcare clinical data, using modern big data and cloud technologies (Azure, Databricks, and Spark), ensuring alignment with UnitedHealth Group's data governance standards. This role requires a hands-on leader who can write and review code, mentor teams, and collaborate across business and technical stakeholders to drive data strategy and innovation. The person needs to be ready to take up AI and AIOps as part of their work, support the data science teams with ideas, and review their work.
Primary Responsibilities
Architecture: Design and lead the implementation of robust, scalable, and secure data architectures for clinical and healthcare data, covering both batch and real-time pipelines. Architect end-to-end data pipelines using big data and cloud-native technologies (e.g., Spark, Databricks, Azure Data Factory). Ensure data solutions meet performance, scalability, and compliance requirements, including HIPAA and internal governance policies.
Schema & Data Management: Bring strong experience with designing, evolving, and reviewing database schemas, including schema management for unstructured data, structured data, relational models, and star schemas. Design and manage semantic data elements (metadata, configuration, master data), and build automated pipelines to keep them up to date from upstream sources.
Pipelines: Build and optimize data ingestion, transformation, and storage pipelines for structured and unstructured clinical data. Guide the teams doing this work and ensure support for incremental data processing. Ensure data quality and lineage are embedded in all solutions. Lead code reviews, proofs of concept, and performance tuning for large-scale data systems.
Governance: Collaborate with data governance teams to ensure adherence to UHG and healthcare data standards, lineage, certification, data use rights, and data privacy. Contribute to the maturity of data governance domains and participate in governance councils and working groups.
AI Operations: Design, build, and monitor MLOps pipelines, model inference, and robust pipelines for running AI operations on data.
Secondary Responsibilities
Mentor data engineers and analysts, fostering a culture of technical excellence and continuous learning. Collaborate with product managers, data scientists, and business stakeholders to translate requirements into data solutions. Influence architectural decisions across teams and contribute to enterprise-wide data strategy. Stay current with emerging technologies in cloud, big data, and AI/ML, and evaluate their applicability to healthcare data. Promote the use of generative AI tools (e.g., GitHub Copilot) to enhance development productivity and innovation. Drive adoption of DevOps and DataOps practices, including CI/CD, IaC, and automated testing for data pipelines.
Required Skills & Qualifications
Technical Skills: Ideally 11+ years of experience in data architecture, data engineering, or related roles, with a focus on healthcare or clinical data preferred. Proven track record of designing and delivering large-scale data solutions in cloud environments.
Cloud Platforms: Strong experience with Azure (preferred), AWS, or GCP.
Big Data Technologies: Proficient in Apache Spark, Databricks, Delta Lake, and distributed data processing.
Data Engineering: Expertise in building ETL/ELT pipelines, data lakes, and real-time streaming architectures using Python, Scala, or other comparable technologies.
Data Modeling: Deep understanding of dimensional modeling, canonical models, and healthcare data standards (e.g., HL7, FHIR).
Programming: Proficiency in Python, SQL, and optionally Scala or Java.
DevOps/DataOps: Familiarity with CI/CD and IaC (Terraform, ARM).
Soft Skills: Strong leadership, communication, and stakeholder management skills. Ability to mentor and influence across teams and levels. Strategic thinker with a passion for data-driven innovation. Ability to get into the details whenever required and spend time understanding and solving problems.
Preferred Skills: Experience with healthcare data interoperability standards (FHIR, HL7, CCD). Familiarity with MLOps and integrating data pipelines with ML workflows. Contributions to open-source projects or publications in data architecture or healthcare analytics.
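One concrete habit behind the schema-management duties above is enforcing an explicit schema at ingestion, so malformed clinical records fail fast instead of propagating downstream. The field names below follow a generic observation shape and are assumptions, not a UHG or HL7/FHIR standard.

```python
# Schema-enforced read of semi-structured clinical observations (PySpark).
from pyspark.sql import SparkSession
from pyspark.sql.types import (DoubleType, StringType, StructField,
                               StructType, TimestampType)

schema = StructType([
    StructField("patient_id", StringType(), nullable=False),
    StructField("observation_code", StringType(), nullable=False),
    StructField("observation_value", DoubleType(), nullable=True),
    StructField("effective_ts", TimestampType(), nullable=True),
    StructField("source_system", StringType(), nullable=True),
])

spark = SparkSession.builder.getOrCreate()

# FAILFAST aborts the batch on nonconforming rows rather than silently
# nulling fields, which keeps bad data out of downstream layers.
observations = (spark.read
                .schema(schema)
                .option("mode", "FAILFAST")
                .json("/mnt/landing/observations/"))
```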
Posted 1 week ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
P2-C3-TSTS
AWS Data Engineer
Design and build scalable data pipelines using AWS services like AWS Glue, Amazon Redshift, and S3. Develop efficient ETL processes for data extraction, transformation, and loading into data warehouses and lakes. Create and manage applications using Python, SQL, Databricks, and various AWS technologies. Automate repetitive tasks and build reusable frameworks to improve efficiency.
Skill proficiency level expected: AWS Glue, Amazon Redshift, S3, ETL processes, SQL, Databricks.
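A minimal AWS Glue job script matching the stack above: read a cataloged table, fix types, and write curated Parquet to S3. The database, table, and bucket names are placeholders.

```python
# Minimal AWS Glue ETL job sketch (runs inside Glue; names are placeholders).
import sys
from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_ctx = GlueContext(SparkContext())
job = Job(glue_ctx)
job.init(args["JOB_NAME"], args)

# Read the raw table registered in the Glue Data Catalog.
orders = glue_ctx.create_dynamic_frame.from_catalog(
    database="sales", table_name="raw_orders")

# Drop extraction debris and resolve the amount column to a double.
clean = (orders
         .drop_fields(["_corrupt_record"])
         .resolveChoice(specs=[("amount", "cast:double")]))

glue_ctx.write_dynamic_frame.from_options(
    frame=clean,
    connection_type="s3",
    connection_options={"path": "s3://my-bucket/curated/orders/"},
    format="parquet",
)
job.commit()
```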
Posted 1 week ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
WorkMode: Hybrid
Work Location: Chennai / Hyderabad / Bangalore / Pune / Mumbai / Gurgaon
Work Timing: 2 PM to 11 PM
Primary: AWS Data Engineer
Design and build scalable data pipelines using AWS services like AWS Glue, Amazon Redshift, and S3. Develop efficient ETL processes for data extraction, transformation, and loading into data warehouses and lakes. Create and manage applications using Python, SQL, Databricks, and various AWS technologies. Automate repetitive tasks and build reusable frameworks to improve efficiency.
Skill proficiency level expected: AWS Glue, Amazon Redshift, S3, ETL processes, SQL, Databricks.
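On the "automate repetitive tasks" point above, a small boto3 sketch that kicks off a Glue job and polls it to completion; the job name and region are placeholders.

```python
# Automate a recurring Glue run: start the job and poll until it finishes.
import time
import boto3

glue = boto3.client("glue", region_name="ap-south-1")  # region is a placeholder

run_id = glue.start_job_run(JobName="nightly-orders-etl")["JobRunId"]
while True:
    state = glue.get_job_run(JobName="nightly-orders-etl",
                             RunId=run_id)["JobRun"]["JobRunState"]
    if state in ("SUCCEEDED", "FAILED", "STOPPED", "TIMEOUT"):
        print("final state:", state)
        break
    time.sleep(30)  # poll at a gentle interval to avoid API throttling
```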
Posted 1 week ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
WorkMode: Hybrid
Work Location: Chennai / Hyderabad / Bangalore / Pune / Mumbai / Gurgaon
Work Timing: 2 PM to 11 PM
Primary: AWS Data Engineer
Design and build scalable data pipelines using AWS services like AWS Glue, Amazon Redshift, and S3. Develop efficient ETL processes for data extraction, transformation, and loading into data warehouses and lakes. Create and manage applications using Python, SQL, Databricks, and various AWS technologies. Automate repetitive tasks and build reusable frameworks to improve efficiency.
Skill proficiency level expected: AWS Glue, Amazon Redshift, S3, ETL processes, SQL, Databricks.
Posted 1 week ago