7.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Company Description
The Smart Cube, a WNS company, is a trusted partner for high-performing intelligence that answers critical business questions. And we work with our clients to figure out how to implement the answers, faster.

Job Description

Roles and Responsibilities
Assistant Managers must understand client objectives and collaborate with the Project Lead to design effective analytical frameworks. They should translate requirements into clear deliverables with defined priorities and constraints. Responsibilities include managing data preparation, performing quality checks, and ensuring analysis readiness. They should implement analytical techniques and machine learning methods such as regression, decision trees, segmentation, and forecasting, and algorithms like Random Forest, SVM, and ANN.

They are expected to perform sanity checks and quality control of their own work as well as that of junior analysts to ensure accuracy. The ability to interpret results in a business context and identify actionable insights is critical. Assistant Managers should handle client communications independently and interact with onsite leads, discussing deliverables and addressing queries over calls or video conferences.

They are responsible for managing the entire project lifecycle from initiation to delivery, ensuring timelines and budgets are met. This includes translating business requirements into technical specifications, managing data teams, ensuring data integrity, and facilitating clear communication between business and technical stakeholders. They should lead process improvements in analytics and act as project leads for cross-functional coordination.

Client Management
They serve as client leads, maintaining strong relationships and making key decisions.
They participate in deliverable discussions and guide project teams on next steps and execution strategy.

Technical Requirements
Assistant Managers must know how to connect databases with Knime (e.g., Snowflake, SQL) and understand SQL concepts such as joins and unions. They should be able to read and write data to and from databases and use macros and schedulers to automate workflows. They must design and manage Knime ETL workflows to support BI tools and ensure end-to-end data validation and documentation. Proficiency in Power BI is required for building dashboards and supporting data-driven decision-making. They must be capable of leading analytics projects using Power BI, Python, and SQL to generate insights. Visualizing key findings using PowerPoint or BI tools like Tableau or Qlikview is essential.

Ideal Candidate
Candidates should have 4–7 years of experience in advanced analytics across Marketing, CRM, or Pricing in Retail or CPG; experience in other B2C domains is acceptable. They must be skilled in handling large datasets using Python, R, or SAS and have worked with multiple analytics or machine learning techniques. Comfort with client interactions and working independently is expected, along with a good understanding of consumer sectors such as Retail, CPG, or Telecom. They should have experience with various data formats and platforms, including flat files, RDBMS, Knime workflows and server, SQL Server, Teradata, Hadoop, and Spark—on-prem or in the cloud. Basic knowledge of statistical and machine learning techniques like regression, clustering, decision trees, forecasting (e.g., ARIMA), and other ML models is required.

Other Skills
Strong written and verbal communication is essential. They should be capable of creating client-ready deliverables using Excel and PowerPoint. Knowledge of optimization methods, supply chain concepts, VBA, Excel Macros, Tableau, and Qlikview will be an added advantage.
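As a minimal illustration of the join and union concepts mentioned above, the following Python/sqlite3 sketch uses invented tables and data (not taken from the posting) to show both in a few lines:

```python
import sqlite3

# Hypothetical customers/orders tables, purely for illustration.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL);
    INSERT INTO customers VALUES (1, 'Asha'), (2, 'Ben');
    INSERT INTO orders VALUES (10, 1, 250.0), (11, 1, 75.0), (12, 2, 120.0);
""")

# INNER JOIN: pair each order with the customer who placed it.
rows = cur.execute("""
    SELECT c.name, o.amount
    FROM orders o
    JOIN customers c ON c.id = o.customer_id
    ORDER BY o.id
""").fetchall()

# UNION deduplicates rows across the two SELECTs (UNION ALL would keep both copies).
names = cur.execute("""
    SELECT name FROM customers
    UNION
    SELECT name FROM customers
    ORDER BY name
""").fetchall()

print(rows)   # each order paired with a customer name
print(names)  # deduplicated name list
```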
Qualifications
Engineers from top-tier institutes (IITs, DCE/NSIT, NITs), or Postgraduates in Maths/Statistics/OR from top-tier colleges/universities, or an MBA from top-tier B-schools.
Posted 2 days ago
5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
About The Role
Uber's data infrastructure is composed of a wide variety of compute engines, scheduling/execution solutions, and storage solutions. Compute engines such as Apache Spark™, Presto®, Apache Hive™, Neutrino, and Apache Flink® allow Uber to run petabyte-scale operations on a daily basis. Scheduling and execution engines such as Piper (Uber's fork of Apache Airflow™), Query Builder (user platform for executing compute SQLs), and Query Runner (proxy layer for execution of workloads) exist to allow scheduling and execution of compute workloads. Finally, a significant portion of storage is supported by HDFS, Google Cloud Storage (GCS), Apache Pinot™, ElasticSearch®, etc. Each engine supports thousands of executions, which are owned by multiple owners and sub-teams.

With such a complex and diverse big data landscape operating at petabyte scale, and around a million applications/queries running each day, it's imperative to give stakeholders a holistic view of the right performance and resource consumption insights. DataCentral is a comprehensive platform that provides users with essential insights into big data applications and queries. It empowers data platform users by offering detailed information on workflows and apps, improving productivity by reducing debugging time, and improving cost efficiency by providing detailed resource efficiency insights.

As an engineer on the DataCentral team, you will be solving some of the most complex problems in observability and efficiency of distributed data systems at Uber scale.

What You'll Do
Work with Uber data science and engineering teams to improve observability of batch data use cases at Uber. Leverage knowledge of Spark internals to dramatically improve customers' Spark job performance. Design and implement AI-based solutions to improve application debuggability.
Design and implement algorithms to optimize resource consumption without impacting reliability. Design and develop prediction and forecasting models to proactively predict system degradations and failures. Work with multiple partner teams within Uber and build cross-functional solutions in a collaborative work environment. Work with the community to upstream Uber's contributions to open source, and keep our internal fork up to date.

What You'll Need
Bachelor's degree in Computer Science or related field. 5+ years of experience building large-scale distributed software systems. Solid understanding of Java for backend/systems software development. MS/PhD in Computer Science or related field. Experience managing production systems with a strong availability SLA. Experience working with Apache Spark or similar analytics technologies. Experience working with large-scale distributed systems, HDFS/YARN. Experience working with SQL compilers and SQL plan/runtime optimization. Experience working with Kubernetes.
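The "proactively predict system degradations" idea above can be sketched with a simple trailing-window baseline; the window size, threshold, and data below are illustrative assumptions, not Uber's actual method:

```python
from statistics import mean, stdev

def flag_degradations(latencies, window=5, k=3.0):
    """Flag points sitting more than k standard deviations above the
    trailing-window mean -- a toy stand-in for a real forecasting model."""
    flags = []
    for i in range(window, len(latencies)):
        hist = latencies[i - window:i]
        mu, sigma = mean(hist), stdev(hist)
        flags.append(latencies[i] > mu + k * max(sigma, 1e-9))
    return flags

# Stable latencies, then a spike a proactive monitor should catch.
series = [100, 102, 99, 101, 100, 103, 98, 400]
print(flag_degradations(series))  # → [False, False, True]
```

A production system would replace the trailing mean with a proper forecast (seasonality, trend) and emit alerts instead of booleans, but the shape of the check is the same.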
Posted 2 days ago
6.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
Who We Are
Boston Consulting Group partners with leaders in business and society to tackle their most important challenges and capture their greatest opportunities. BCG was the pioneer in business strategy when it was founded in 1963. Today, we help clients with total transformation: inspiring complex change, enabling organizations to grow, building competitive advantage, and driving bottom-line impact. To succeed, organizations must blend digital and human capabilities. Our diverse, global teams bring deep industry and functional expertise and a range of perspectives to spark change. BCG delivers solutions through leading-edge management consulting along with technology and design, corporate and digital ventures—and business purpose. We work in a uniquely collaborative model across the firm and throughout all levels of the client organization, generating results that allow our clients to thrive.

What You'll Do
The Finance and Procurement Product Portfolio plays a critical role at the intersection of BCG’s Global Finance and IT functions. Its purpose is to deliver best-in-class financial systems that enable the organization to transact efficiently and responsibly, capture, enrich, and report core financial data to BCG’s business leaders, and comply with the complex fiscal requirements of a multinational organization. The portfolio also ensures that its platforms provide an exceptional user experience, align with BCG’s global IT enterprise architecture and security standards, and minimize the total cost of operations. It embraces the latest technologies and Agile ways of working to enhance financial systems and deliver business value at scale. BCG’s global finance function is undergoing a major transformation, with technology serving as a core enabler. A key part of this transformation is the deployment of SAP S/4HANA Public Cloud as the firm’s global ERP platform.
This new platform creates exciting leadership opportunities, including the Data Integrity and Report Enablement Product Analyst role. In this role, you will be part of a squad with a defined mandate focused on data quality and reporting. You will play a key role in ensuring that data flows seamlessly between BCG’s data layer and core BCG platforms (e.g., Workday), while meeting standards for accuracy and completeness. You will also leverage your knowledge of SAP’s native capabilities to help local finance teams in 60+ markets access and build the reports they need to fulfill statutory and compliance requirements. The Product Analyst is crucial to driving key operations activities and innovation within the squad.

Among Your Responsibilities, You Will

Squad Operations & Delivery Support
Collaborate with the Product Owner to shape and prioritize the squad’s backlog, ensuring alignment with business needs. Represent the Product Owner and collaborate with stakeholders to define business requirements and translate them into clear, actionable user stories or functional specifications to guide the engineering teams on work to be performed. Support the full delivery lifecycle, including coordination of testing, issue resolution, and deployment of new features and data updates. Support change management efforts through creation of documentation, training content, and communications for new tools or processes. Champion an Agile mindset across all squad activities, promoting iterative delivery, continuous feedback, and a focus on delivering value early and often.

Data Quality & Integrations
Develop a strong understanding of core financial data in SAP, how it is structured, and how it links across BCG systems (e.g., the enterprise data layer) to support accurate and scalable data flows.
Partner with other portfolios and the integrations chapter to design and validate new data flows. Monitor the accuracy, completeness, and consistency of data flowing between BCG’s enterprise data layer and key platforms, identifying and addressing any data quality gaps. Investigate data issues or inconsistencies, perform root cause analysis, and work collaboratively with the engineering teams to implement sustainable solutions. Help define and implement proactive data quality controls and business rules that improve data reliability and reduce manual intervention.

Compliance and Statutory Report Enablement
Build strong, collaborative relationships with stakeholders and champion a self-service, sustainable reporting model. Engage with stakeholders to understand reporting needs and guide them in using SAP’s native capabilities, providing coaching or documentation to enable self-service reporting. When needs cannot be met through SAP, define requirements and collaborate with the squad to build fit-for-purpose reports using alternative reporting tools. Support the development and testing of custom reports to ensure they meet stakeholder expectations and functional requirements. Facilitate the successful handoff of custom-built reports to end users, enabling them to maintain and evolve them independently.

YOU’RE GOOD AT
Applying critical thinking to analyze complex financial data and identify patterns, gaps, and opportunities for improvement. Collaborating effectively with cross-functional teams to translate business needs into clear, actionable requirements. Managing multiple priorities and tasks with strong attention to detail and a focus on timely delivery. Communicating clearly and proactively with team members, stakeholders, and end users. Working independently to investigate and resolve issues, while knowing when to escalate for support. Embracing an Agile mindset and thriving in a fast-paced, iterative work environment.
What You'll Bring
A strong foundation in financial processes and a passion for working with financial systems and data. Bachelor’s degree in Finance, Accounting, Information Systems, Business Administration, or a related field. 6 years of relevant experience in finance, accounting, data analysis, or systems support. Familiarity with ERP platforms (such as SAP S/4HANA) and financial reporting tools. Understanding of Agile ways of working and the ability to contribute within a squad-based delivery model. Excellent organizational skills, with the ability to structure and document complex information clearly.

Who You'll Work With
The squad Product Owner, who will be your line manager. All squad members, including technical engineers and quality assurance engineers. Internal clients, including functional Business Process Owners, translating their voice and needs into user stories and engaging users as needed. A Scrum Lead, who will remove impediments and will assist you in preparing the required artifacts and managing ceremonies. Other Product Owners and Product Analysts within BCG, to share best practices and ensure alignment between squads and culture.

Boston Consulting Group is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, age, religion, sex, sexual orientation, gender identity / expression, national origin, disability, protected veteran status, or any other characteristic protected under national, provincial, or local law, where applicable, and those with criminal histories will be considered in a manner consistent with applicable state and local laws. BCG is an E-Verify Employer.
Posted 2 days ago
0 years
0 Lacs
Hyderabad, Telangana, India
Remote
When you join Verizon
You want more out of a career. A place to share your ideas freely — even if they’re daring or different. Where the true you can learn, grow, and thrive. At Verizon, we power and empower how people live, work and play by connecting them to what brings them joy. We do what we love — driving innovation, creativity, and impact in the world. Our V Team is a community of people who anticipate, lead, and believe that listening is where learning begins. In crisis and in celebration, we come together — lifting our communities and building trust in how we show up, everywhere & always. Want in? Join the #VTeamLife.

What You’ll Be Doing...
We are seeking a visionary and technically strong Senior AI Architect to join our Billing IT organization in driving innovation at the intersection of telecom billing, customer experience, and artificial intelligence. This leadership role will be pivotal in designing, developing, and scaling AI-led solutions that redefine how we bill our customers, improve their billing experience, and derive actionable insights from billing data. You will work closely with cross-functional teams to lead initiatives that transform customer-facing systems, backend data platforms, and software development practices through modern AI technologies.

Key Responsibilities
Customer Experience Innovation: Designing and implementing AI-driven enhancements to improve the telecom customer experience, particularly in the billing domain. Leading end-to-end initiatives that personalize, simplify, and demystify billing interactions for customers.
AI Tools and Platforms: Evaluating and implementing cutting-edge AI/ML models, LLMs, SLMs, and AI-powered solutions for use across the billing ecosystem. Developing prototypes and production-grade AI tools to solve real-world customer pain points.
Prompt Engineering & Applied AI: Exhibiting deep expertise in prompt engineering and advanced LLM usage to build conversational tools, intelligent agents, and self-service experiences for customers and support teams. Partnering with design and development teams to build intuitive AI interfaces and utilities.
AI Pair Programming Leadership: Demonstrating hands-on experience with AI-assisted development tools (e.g., GitHub Copilot, Codeium). Driving adoption of such tools across development teams, tracking measurable productivity improvements, and integrating them into SDLC pipelines.
Data-Driven Insight Generation: Leading large-scale data analysis initiatives using AI/ML methods to generate meaningful business insights, predict customer behavior, and prevent billing-related issues. Establishing feedback loops between customer behavior and billing system design.
Thought Leadership & Strategy: Acting as a thought leader in AI and customer experience within the organization. Staying abreast of trends in AI and telecom customer experience, and regularly benchmarking internal initiatives against industry best practices.
Architectural Excellence: Owning and evolving the technical architecture of AI-driven billing capabilities, ensuring scalability, performance, security, and maintainability. Collaborating with enterprise architects and domain leads to align with broader IT and digital transformation goals.
Telecom Billing Domain Expertise: Bringing a deep understanding of telecom billing functions, processes, and IT architectures, including usage processing, rating, billing cycles, invoice generation, adjustments, and revenue assurance. Providing architectural guidance to ensure AI and analytics solutions are well integrated into core billing platforms with minimal operational risk.

Where you'll be working...
In this hybrid role, you'll have a defined work location that includes work from home and assigned office days set by your manager.

What We’re Looking For...
You’re energized by the prospect of putting your advanced expertise to work as one of the most senior members of the team. You’re motivated by working on groundbreaking technologies to have an impact on people’s lives.

You’ll Need To Have
Bachelor’s degree or four or more years of work experience. Six or more years of relevant experience, demonstrated through one or a combination of work experience. Strong understanding of AI/ML concepts, including generative AI and LLMs (Large Language Models), with the ability to evaluate and apply them to solve real-world problems in telecom and billing. Familiarity with industry-leading AI models and platforms (e.g., OpenAI GPT, Google Gemini, Microsoft Phi, Meta LLaMA, AWS Bedrock), and understanding of their comparative strengths, pricing models, and applicability. Ability to scan and interpret AI industry trends, identify emerging tools, and match them to business use cases (e.g., bill explainability, predictive analytics, anomaly detection, agent assist). Skilled in adopting and integrating third-party AI tools—rather than building from scratch—into existing IT systems, ensuring fit-for-purpose usage with strong ROI. Experience working with AI product vendors, evaluating PoCs, and influencing make-or-buy decisions for AI capabilities. Comfortable guiding cross-functional teams (tech, product, operations) on where and how to apply AI tools, including identifying appropriate use cases and measuring impact. Deep expertise in writing effective and optimized prompts across various LLMs. Knowledge of prompt chaining, tool-use prompting, function calling, embedding techniques, and vector search optimization. Ability to mentor others on best practices for LLM prompt engineering and prompt tuning. In-depth understanding of telecom billing functions: mediation, rating, charging, invoicing, adjustments, discounts, taxes, collections, and dispute management.
Strong grasp of billing SLAs, accuracy metrics, and compliance requirements in a telecom environment. Proven ability to define and evolve cloud-native, microservices-based architectures with AI components. Deep understanding of software engineering practices including modular design, API-first development, testing automation, and observability. Experience in designing scalable, resilient systems for high-volume data pipelines and customer interactions. Demonstrated hands-on use of tools like GitHub Copilot, Codeium, AWS CodeWhisperer, etc. Strong track record in scaling adoption of AI pair programming tools across engineering teams. Ability to quantify productivity improvements and integrate tooling into CI/CD pipelines. Skilled in working with large-scale structured and unstructured billing and customer data. Proficiency in tools like SQL, Python (Pandas, NumPy), Spark, and data visualization platforms (e.g., Power BI, Tableau). Experience designing and operationalizing AI/ML models to derive billing insights, detect anomalies, or improve revenue assurance. Excellent ability to translate complex technical concepts for business stakeholders. Influential leadership with a track record of driving innovation, change management, and cross-functional collaboration. Ability to coach and mentor engineers, analysts, and product owners on AI technologies and best practices. Keen awareness of emerging AI trends, vendor platforms, open-source initiatives, and market best practices. Active engagement in AI communities, publications, or proof-of-concept experimentation.

Even better if you have one or more of the following:
A master’s degree.

If Verizon and this role sound like a fit for you, we encourage you to apply even if you don’t meet every “even better” qualification listed above.
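The "billing insights" and "bill explainability" use cases mentioned above can be sketched with a simple rule in plain Python; the account IDs, amounts, and 50% threshold are invented for illustration and are not Verizon's actual logic:

```python
def bill_shock_candidates(invoices, threshold=0.5):
    """Flag accounts whose newest invoice deviates from their own
    historical average by more than `threshold` (50% by default).
    Toy illustration of a billing-insight rule, not production logic."""
    by_account = {}
    for account, amount in invoices:          # invoices in chronological order
        by_account.setdefault(account, []).append(amount)
    flagged = []
    for account, amounts in by_account.items():
        if len(amounts) < 2:
            continue                          # no history to compare against
        history, latest = amounts[:-1], amounts[-1]
        baseline = sum(history) / len(history)
        if abs(latest - baseline) / baseline > threshold:
            flagged.append(account)
    return flagged

bills = [("A1", 40.0), ("A2", 60.0), ("A1", 42.0), ("A2", 61.0),
         ("A1", 95.0),  # A1 jumps well above its ~41 average
         ("A2", 62.0)]
print(bill_shock_candidates(bills))  # → ['A1']
```

A real pipeline would compute the same kind of per-account baseline in Pandas or SQL over millions of rows, then feed flagged accounts into an explainability or agent-assist flow.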
Scheduled Weekly Hours
40

Equal Employment Opportunity
Verizon is an equal opportunity employer. We evaluate qualified applicants without regard to race, gender, disability or any other legally protected characteristics.
Posted 2 days ago
10.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
Who We Are
Boston Consulting Group partners with leaders in business and society to tackle their most important challenges and capture their greatest opportunities. BCG was the pioneer in business strategy when it was founded in 1963. Today, we help clients with total transformation: inspiring complex change, enabling organizations to grow, building competitive advantage, and driving bottom-line impact. To succeed, organizations must blend digital and human capabilities. Our diverse, global teams bring deep industry and functional expertise and a range of perspectives to spark change. BCG delivers solutions through leading-edge management consulting along with technology and design, corporate and digital ventures—and business purpose. We work in a uniquely collaborative model across the firm and throughout all levels of the client organization, generating results that allow our clients to thrive.

What You'll Do
The Finance and Procurement Product Portfolio plays a critical role at the intersection of BCG’s Global Finance and IT functions. Its purpose is to deliver best-in-class financial systems that enable the organization to transact efficiently and responsibly, capture, enrich, and report core financial data to BCG’s business leaders, and comply with the complex fiscal requirements of a multinational organization. The portfolio also ensures that its platforms provide an exceptional user experience, align with BCG’s global IT enterprise architecture and security standards, and minimize the total cost of operations. It embraces the latest technologies and Agile ways of working to enhance financial systems and deliver business value at scale. BCG’s global finance function is undergoing a major transformation, with technology serving as a core enabler. A key part of this transformation is the deployment of SAP S/4HANA Public Cloud as the firm’s global ERP platform.
This new platform creates exciting leadership opportunities, including the Data Integrity and Report Enablement Product Owner role. In this role, you will shape the squad’s vision, priorities, and tools while operating within a clearly defined mandate focused on data quality and reporting. As BCG transitions to SAP, the squad will play a critical role in understanding how financial data is captured and structured in the new ERP. You will work across portfolios to ensure that data flows seamlessly between the data layer and core BCG platforms such as Workday, while meeting standards for accuracy and completeness. You will also help leverage SAP’s native capabilities to empower local finance teams in more than 60 markets to access and build the reports they need to fulfill statutory and compliance requirements. To deliver results, you will engage with key stakeholders, including the SAP Product Lead and functional Business Process Owners, to understand business priorities, translate them into a product roadmap, prioritize a backlog of enhancements, and build business cases for potential investment. Leveraging your expertise in finance, accounting, and SAP, and following Agile principles, you will break down enhancements into meaningful segments of work for your squad to deliver and will track progress toward the desired outcomes. 
Among Your Responsibilities, You Will

Deliver business results and customer value
Operationalize the product vision by connecting the dots between high-level platform goals and specific initiatives. Track product performance to inform future work. Deliver on specific and measurable KPIs to be defined by your squad. Support the SAP Product Lead in engaging with the funding process and manage the relevant product budget. Serve as the voice of the internal client.

Support and enable the Squad to get its work done
Regularly engage with the Squad to offer feedback on work-in-progress and clarify requirements. Engage with Chapter Leads regarding resourcing and technical expertise required in the Squad. Provide feedback as part of performance management of Squad members and other members of the Portfolio.

Set an overall vision to direct and inform the Squad’s work
Work closely with the SAP Product Lead and Product Portfolio Lead to understand and drive alignment on the Portfolio’s business strategy, goals, and objectives. Translate Portfolio objectives into a clear vision (e.g., via KPIs, sprint goals) for your Squad to influence the creation and prioritization of the Squad's backlog of work. Share information about the Squad’s output and priorities with other Product Owners to ensure alignment across the organization.

Enable the organization’s way of working
Actively create and maintain a Squad culture based on BCG’s values and Agile behaviors. Model behaviors to support the organization’s adoption of new ways of working, including how AI can enhance productivity. Support Product Analyst growth and development through informal and formal feedback as part of BCG’s performance management process.

YOU’RE GOOD AT
Critical thinking and balancing information from multiple sources (technical and functional) to guide the squad to the correct outcome. Applying a consultative approach to interactions with the squad and stakeholders to build strong relationships and trust. Being customer-focused and dedicated to understanding and learning about customer needs and requirements. Operating with a transparency mindset, communicating clearly and openly. Working with ambiguous requirements and multi-disciplinary teams. Influencing stakeholders up to the senior levels of the organization. Bringing a data-driven approach to decision making, both in day-to-day management and in making strategic trade-offs. Looking for opportunities to innovate and get things done better and faster.

What You'll Bring
Demonstrated experience as a Product Owner. A passion for financial data management and learning new financial reporting tools. 10 years’ relevant experience in a global finance organization. Bachelor’s degree in Finance, Accounting, Information Systems, Business Administration, or a related field. Understanding of Agile principles and ways of working. Divergent thinker who can converge ideas into tangible products. Exceptional communication and stakeholder management skills. Experience in consulting or professional services is a plus.

Who You'll Work With
All members of your squad, for whom you will be their servant leader. A Product Analyst, who will report to you and support you in your role. Internal clients, including functional Business Process Owners, translating their voice and needs into user stories and engaging users as needed. The Product Lead, who will be your line manager and coordinate work across SAP squads. The Product Portfolio Lead, who will set the vision, roadmap, budget, priorities, and OKRs for the Portfolio and, subsequently, for the squads. Scrum Leads, who will act as your right hand to remove impediments and will assist you in preparing the required artifacts and managing ceremonies. Chapter Leads and the Technical Lead, for technical solutioning and delivery. Agile Coaches, with whom you will share a passion for Agile ways of working. Other Product Owners within BCG, to share best practices and ensure alignment between squads and culture.

Boston Consulting Group is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, age, religion, sex, sexual orientation, gender identity / expression, national origin, disability, protected veteran status, or any other characteristic protected under national, provincial, or local law, where applicable, and those with criminal histories will be considered in a manner consistent with applicable state and local laws. BCG is an E-Verify Employer.
Posted 2 days ago
6.0 - 8.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
Introduction
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.

Your Role And Responsibilities
As a consultant at IBM, you'll play a vital role in the development and design of applications, providing regular support and guidance to project teams on complex coding, issue resolution, and execution. Your primary responsibilities include: leading the design and construction of new solutions using the latest technologies, always looking to add business value and meet user requirements; striving for continuous improvement by testing the built solution and working under an agile framework; and discovering and implementing the latest technology trends to build creative solutions.

Preferred Education
Master's Degree

Required Technical And Professional Expertise
6-8 years of overall IT experience, with a minimum of 4 years in Python development. Good experience with Python and Spark, writing reusable code and frameworks. Write structured, clean, reusable, and testable code using Python. Good understanding of database design, with the ability to write complex SQL queries. Excellent knowledge of Python and API frameworks (Django, Flask). Implement well-designed, high-performance applications for the server side. Knowledge of the threading functions of Python.
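The "threading functions of Python" requirement above can be illustrated with a minimal concurrent.futures sketch (a generic example, not IBM project code):

```python
from concurrent.futures import ThreadPoolExecutor

def fetch(record_id):
    # Stand-in for an I/O-bound call (API request, database query).
    return record_id * record_id

# Threads help when tasks wait on I/O; a pool bounds concurrency.
with ThreadPoolExecutor(max_workers=4) as pool:
    # map() preserves input order even though work runs concurrently.
    results = list(pool.map(fetch, range(5)))

print(results)  # → [0, 1, 4, 9, 16]
```

For CPU-bound Python code the GIL limits threading's benefit, which is why server-side frameworks like Django and Flask typically pair threads with worker processes.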
Posted 2 days ago
6.0 - 10.0 years
16 - 25 Lacs
Vadodara
Work from Office
Job Summary:
We are looking for a highly skilled Senior Data Engineer with expertise in Snowflake, DBT (Data Build Tool), and SAP Data Services (SAP DS). The ideal candidate will be responsible for building scalable data pipelines, designing robust data models, and ensuring high data quality across enterprise platforms.

Key Responsibilities:
- Design, build, and optimize data pipelines and ETL/ELT workflows using Snowflake and DBT
- Integrate and manage data from various sources using SAP Data Services
- Develop and maintain scalable data models, data marts, and data warehouses
- Work closely with data analysts, business stakeholders, and BI teams to support reporting and analytics needs
- Implement best practices in data governance, data lineage, and metadata management
- Monitor data quality, troubleshoot issues, and ensure data integrity
- Optimize Snowflake data warehouse performance (partitioning, caching, query tuning)
- Automate data workflows and deploy DBT models with CI/CD tools (e.g., Git, Jenkins)
- Document architecture, data flows, and technical specifications
Posted 2 days ago
6.0 - 10.0 years
16 - 25 Lacs
Agra
Work from Office
Job Summary:
We are looking for a highly skilled Senior Data Engineer with expertise in Snowflake, DBT (Data Build Tool), and SAP Data Services (SAP DS). The ideal candidate will be responsible for building scalable data pipelines, designing robust data models, and ensuring high data quality across enterprise platforms.

Key Responsibilities:
- Design, build, and optimize data pipelines and ETL/ELT workflows using Snowflake and DBT
- Integrate and manage data from various sources using SAP Data Services
- Develop and maintain scalable data models, data marts, and data warehouses
- Work closely with data analysts, business stakeholders, and BI teams to support reporting and analytics needs
- Implement best practices in data governance, data lineage, and metadata management
- Monitor data quality, troubleshoot issues, and ensure data integrity
- Optimize Snowflake data warehouse performance (partitioning, caching, query tuning)
- Automate data workflows and deploy DBT models with CI/CD tools (e.g., Git, Jenkins)
- Document architecture, data flows, and technical specifications
Posted 2 days ago
3.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data, and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits, and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

Primary Responsibilities:
- Analyze business requirements and functional specifications
- Determine the impact of changes on the current functionality of the system
- Interact with diverse business partners and technical workgroups
- Be flexible to collaborate with onshore business during US business hours
- Be flexible to support project releases during US business hours
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies regarding flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary, or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so

Required Qualifications:
- Undergraduate degree or equivalent experience
- 3+ years of working experience in Python, PySpark, and Scala
- 3+ years of experience working with MS SQL Server and NoSQL databases like Cassandra
- Hands-on working experience with Azure Databricks
- Exposure to DevOps methodology and creating CI/CD deployment pipelines
- Exposure to Agile methodology, specifically using tools like Rally
- Solid healthcare domain knowledge
- Ability to understand the existing application codebase, perform impact analysis, and update the code when required based on business logic or for optimization
- Proven excellent analytical and communication skills (both verbal and written)

Preferred Qualification:
- Experience with streaming applications (Kafka, Spark Streaming, etc.)

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location, and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health, which are disproportionately experienced by people of color, historically marginalized groups, and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes, an enterprise priority reflected in our mission.
Posted 2 days ago
1.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Description
As a Research Analyst, you'll collaborate with experts to develop advanced machine learning solutions for business needs. You'll drive product pilots, demonstrating innovative thinking and customer focus. You'll build scalable solutions, write high-quality code, and develop state-of-the-art ML models. You'll coordinate between science and software teams, optimizing solutions. The role requires thriving in ambiguous, fast-paced environments and working independently with ML models.

Key job responsibilities:
- Collaborate with Applied Scientists to implement ML/LLM solutions that meet business goals
- Conduct product pilots demonstrating customer obsession and innovation
- Develop scalable solutions by writing high-quality code, building ML/LLM models using current research breakthroughs, and implementing performance optimization techniques
- Act as a bridge between science and software teams to deliver optimized solutions
- Communicate technical concepts to stakeholders at all levels
- Develop technical documentation covering design specifications, algorithms, implementation challenges, and performance metrics
- Monitor and maintain existing solutions to ensure peak performance

About the team:
The Retail Business Systems (RBS) group is an integral part of Amazon's online product lifecycle and buying operations. The team is designed to ensure Amazon remains competitive in the online retail space through the best catalog quality, wide selection, and supply chain defect and compliance programs. The team's primary role is to create and enhance retail selection in the worldwide Amazon online catalog. The tasks handled have a direct impact on customer buying decisions and the online user experience.

Basic Qualifications:
- Bachelor's degree in quantitative or STEM disciplines (Science, Technology, Engineering, Mathematics)
- 1+ years of relevant work experience in solving real-world business problems using machine learning, deep learning, data mining, and statistical algorithms
- Strong hands-on programming skills in Python, SQL, and Hadoop/Hive; additional knowledge of Spark, Scala, R, and Java desired but not mandatory
- Strong analytical thinking
- Ability to creatively solve business problems, innovating new approaches where required, and articulating ideas to a wide range of audiences using strong data, written, and verbal communication skills
- Ability to collaborate effectively across multiple teams and stakeholders, including development teams, product management, and operations

Preferred Qualifications:
- Master's degree with specialization in ML, NLP, or Computer Vision preferred
- 1+ years of relevant work experience in related fields (project management, customer advocate, product owner, engineering, business analysis); diverse experience will be favored, e.g., a mix of experience across different roles
- In-depth understanding of machine learning concepts, including developing models and tuning hyper-parameters, as well as deploying models and building ML services
- Technical expertise and experience in data science, ML, and statistics

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.

Company: ADCI - Karnataka
Job ID: A3017880
Posted 2 days ago
7.0 - 9.0 years
8 - 14 Lacs
Agra
Work from Office
Job Summary:
We are looking for a seasoned Tech Anchor with deep expertise in Big Data technologies and Python to lead technical design, development, and mentoring across data-driven projects. This role demands a strong grasp of scalable data architecture, problem-solving capabilities, and hands-on experience with distributed systems and modern data frameworks.

Key Responsibilities:
- Provide technical leadership across Big Data and Python-based projects
- Architect, design, and implement scalable data pipelines and processing systems
- Guide teams on best practices in data modeling, ETL/ELT development, and performance optimization
- Collaborate with data scientists, analysts, and stakeholders to ensure effective data solutions
- Conduct code reviews and mentor junior engineers to improve code quality and skills
- Evaluate and implement new tools and frameworks to enhance data capabilities
- Troubleshoot complex data-related issues and support production deployments
- Ensure compliance with data security and governance standards
Posted 2 days ago
7.0 - 9.0 years
8 - 14 Lacs
Vadodara
Work from Office
Job Summary:
We are looking for a seasoned Tech Anchor with deep expertise in Big Data technologies and Python to lead technical design, development, and mentoring across data-driven projects. This role demands a strong grasp of scalable data architecture, problem-solving capabilities, and hands-on experience with distributed systems and modern data frameworks.

Key Responsibilities:
- Provide technical leadership across Big Data and Python-based projects
- Architect, design, and implement scalable data pipelines and processing systems
- Guide teams on best practices in data modeling, ETL/ELT development, and performance optimization
- Collaborate with data scientists, analysts, and stakeholders to ensure effective data solutions
- Conduct code reviews and mentor junior engineers to improve code quality and skills
- Evaluate and implement new tools and frameworks to enhance data capabilities
- Troubleshoot complex data-related issues and support production deployments
- Ensure compliance with data security and governance standards
Posted 2 days ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Description:
- Spark, Java
- Strong SQL writing skills; data discovery, data profiling, data exploration, and data wrangling skills
- Kafka, AWS S3, Lake Formation, Athena, Glue, Autosys or similar tools, FastAPI (secondary)
- Strong SQL skills to support data analysis and embedded business logic in SQL, data profiling, and gap assessment
- Collaborate with development and business SMEs within technology to understand data requirements, and perform data analysis to support and validate business logic, data integrity, and data quality rules within a centralized data platform
- Experience working within the banking/financial services industry, with a solid understanding of financial products and business processes
Posted 2 days ago
0 years
0 Lacs
Kochi, Kerala, India
On-site
Introduction
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.

Your Role and Responsibilities:
- Create solution outlines and macro designs to describe end-to-end product implementation in data platforms, including system integration, data ingestion, data processing, serving layer, design patterns, and platform architecture principles
- Contribute to pre-sales and sales support through RfP responses, solution architecture, planning, and estimation
- Contribute to reusable component / asset / accelerator development to support capability development
- Participate in customer presentations as a Platform Architect / Subject Matter Expert on Big Data, Azure Cloud, and related technologies
- Participate in customer PoCs to deliver the outcomes
- Participate in delivery reviews / product reviews and quality assurance, and work as a design authority

Preferred Education: Master's Degree

Required Technical and Professional Expertise:
- Experience in designing data products providing descriptive, prescriptive, and predictive analytics to end users or other systems
- Experience in data engineering and architecting data platforms
- Experience in architecting and implementing data platforms on the Azure Cloud Platform; Azure cloud experience is mandatory (ADLS Gen1/Gen2, Data Factory, Databricks, Synapse Analytics, Azure SQL, Cosmos DB, Event Hub, Snowflake), Azure Purview, Microsoft Fabric, Kubernetes, Terraform, Airflow
- Experience in the Big Data stack (Hadoop ecosystem: Hive, HBase, Kafka, Spark, Scala, PySpark, Python, etc.) with Cloudera or Hortonworks

Preferred Technical and Professional Experience:
- Experience in architecting complex data platforms on the Azure Cloud Platform and on-prem
- Experience and exposure to implementation of Data Fabric and Data Mesh concepts and solutions, such as Microsoft Fabric, Starburst, Denodo, IBM Data Virtualisation, Talend, or Tibco Data Fabric
- Exposure to data cataloging and governance solutions like Collibra, Alation, Watson Knowledge Catalog, Databricks Unity Catalog, Apache Atlas, Snowflake data glossary, etc.
Posted 2 days ago
3.0 - 8.0 years
2 - 5 Lacs
Bharuch
Hybrid
Required Skills:
- Working knowledge of Big Data / cloud-based / columnar databases (such as Snowflake, AWS Redshift, BigQuery, etc.)
- Experience in scripting languages like Python, R, Spark, Perl, etc.
- Very good knowledge of ETL concepts
- Experience in ETL tools like SSIS, Talend, Pentaho, etc., is a plus
- Very good knowledge of SQL querying
- Rich experience in relational data modelling
- Experience in developing logical, physical, and conceptual data models
- Experience in AWS and Google Cloud services is a plus
- Experience with BI dashboards is a plus
- Implementing automated testing platforms and unit tests
- Proficient understanding of code versioning tools, such as Git
- Strong analytical skills and problem-solving aptitude
- Ability to learn new technologies quickly
- Ability to work with other team members in a collaborative manner
- Passion for learning and working with versatile technologies
Notice Period: Immediate to 15 days
Posted 2 days ago
3.0 - 8.0 years
2 - 5 Lacs
Surendranagar
Hybrid
Required Skills:
- Working knowledge of Big Data / cloud-based / columnar databases (such as Snowflake, AWS Redshift, BigQuery, etc.)
- Experience in scripting languages like Python, R, Spark, Perl, etc.
- Very good knowledge of ETL concepts
- Experience in ETL tools like SSIS, Talend, Pentaho, etc., is a plus
- Very good knowledge of SQL querying
- Rich experience in relational data modelling
- Experience in developing logical, physical, and conceptual data models
- Experience in AWS and Google Cloud services is a plus
- Experience with BI dashboards is a plus
- Implementing automated testing platforms and unit tests
- Proficient understanding of code versioning tools, such as Git
- Strong analytical skills and problem-solving aptitude
- Ability to learn new technologies quickly
- Ability to work with other team members in a collaborative manner
- Passion for learning and working with versatile technologies
Notice Period: Immediate to 15 days
Posted 2 days ago
3.0 - 8.0 years
2 - 5 Lacs
Mehsana
Hybrid
Required Skills:
- Working knowledge of Big Data / cloud-based / columnar databases (such as Snowflake, AWS Redshift, BigQuery, etc.)
- Experience in scripting languages like Python, R, Spark, Perl, etc.
- Very good knowledge of ETL concepts
- Experience in ETL tools like SSIS, Talend, Pentaho, etc., is a plus
- Very good knowledge of SQL querying
- Rich experience in relational data modelling
- Experience in developing logical, physical, and conceptual data models
- Experience in AWS and Google Cloud services is a plus
- Experience with BI dashboards is a plus
- Implementing automated testing platforms and unit tests
- Proficient understanding of code versioning tools, such as Git
- Strong analytical skills and problem-solving aptitude
- Ability to learn new technologies quickly
- Ability to work with other team members in a collaborative manner
- Passion for learning and working with versatile technologies
Notice Period: Immediate to 15 days
Posted 2 days ago
3.0 - 8.0 years
2 - 5 Lacs
Vadodara
Hybrid
Required Skills:
- Working knowledge of Big Data / cloud-based / columnar databases (such as Snowflake, AWS Redshift, BigQuery, etc.)
- Experience in scripting languages like Python, R, Spark, Perl, etc.
- Very good knowledge of ETL concepts
- Experience in ETL tools like SSIS, Talend, Pentaho, etc., is a plus
- Very good knowledge of SQL querying
- Rich experience in relational data modelling
- Experience in developing logical, physical, and conceptual data models
- Experience in AWS and Google Cloud services is a plus
- Experience with BI dashboards is a plus
- Implementing automated testing platforms and unit tests
- Proficient understanding of code versioning tools, such as Git
- Strong analytical skills and problem-solving aptitude
- Ability to learn new technologies quickly
- Ability to work with other team members in a collaborative manner
- Passion for learning and working with versatile technologies
Notice Period: Immediate to 15 days
Posted 2 days ago
3.0 - 8.0 years
2 - 5 Lacs
Surat
Hybrid
Required Skills:
- Working knowledge of Big Data / cloud-based / columnar databases (such as Snowflake, AWS Redshift, BigQuery, etc.)
- Experience in scripting languages like Python, R, Spark, Perl, etc.
- Very good knowledge of ETL concepts
- Experience in ETL tools like SSIS, Talend, Pentaho, etc., is a plus
- Very good knowledge of SQL querying
- Rich experience in relational data modelling
- Experience in developing logical, physical, and conceptual data models
- Experience in AWS and Google Cloud services is a plus
- Experience with BI dashboards is a plus
- Implementing automated testing platforms and unit tests
- Proficient understanding of code versioning tools, such as Git
- Strong analytical skills and problem-solving aptitude
- Ability to learn new technologies quickly
- Ability to work with other team members in a collaborative manner
- Passion for learning and working with versatile technologies
Notice Period: Immediate to 15 days
Posted 2 days ago
3.0 - 8.0 years
2 - 5 Lacs
Rajkot
Hybrid
Required Skills:
- Working knowledge of Big Data / cloud-based / columnar databases (such as Snowflake, AWS Redshift, BigQuery, etc.)
- Experience in scripting languages like Python, R, Spark, Perl, etc.
- Very good knowledge of ETL concepts
- Experience in ETL tools like SSIS, Talend, Pentaho, etc., is a plus
- Very good knowledge of SQL querying
- Rich experience in relational data modelling
- Experience in developing logical, physical, and conceptual data models
- Experience in AWS and Google Cloud services is a plus
- Experience with BI dashboards is a plus
- Implementing automated testing platforms and unit tests
- Proficient understanding of code versioning tools, such as Git
- Strong analytical skills and problem-solving aptitude
- Ability to learn new technologies quickly
- Ability to work with other team members in a collaborative manner
- Passion for learning and working with versatile technologies
Notice Period: Immediate to 15 days
Posted 2 days ago
3.0 - 8.0 years
2 - 5 Lacs
Gandhinagar
Hybrid
Required Skills:
- Working knowledge of Big Data / cloud-based / columnar databases (such as Snowflake, AWS Redshift, BigQuery, etc.)
- Experience in scripting languages like Python, R, Spark, Perl, etc.
- Very good knowledge of ETL concepts
- Experience in ETL tools like SSIS, Talend, Pentaho, etc., is a plus
- Very good knowledge of SQL querying
- Rich experience in relational data modelling
- Experience in developing logical, physical, and conceptual data models
- Experience in AWS and Google Cloud services is a plus
- Experience with BI dashboards is a plus
- Implementing automated testing platforms and unit tests
- Proficient understanding of code versioning tools, such as Git
- Strong analytical skills and problem-solving aptitude
- Ability to learn new technologies quickly
- Ability to work with other team members in a collaborative manner
- Passion for learning and working with versatile technologies
Notice Period: Immediate to 15 days
Posted 2 days ago
3.0 - 8.0 years
2 - 5 Lacs
Bhavnagar
Hybrid
Required Skills:
- Working knowledge of Big Data / cloud-based / columnar databases (such as Snowflake, AWS Redshift, BigQuery, etc.)
- Experience in scripting languages like Python, R, Spark, Perl, etc.
- Very good knowledge of ETL concepts
- Experience in ETL tools like SSIS, Talend, Pentaho, etc., is a plus
- Very good knowledge of SQL querying
- Rich experience in relational data modelling
- Experience in developing logical, physical, and conceptual data models
- Experience in AWS and Google Cloud services is a plus
- Experience with BI dashboards is a plus
- Implementing automated testing platforms and unit tests
- Proficient understanding of code versioning tools, such as Git
- Strong analytical skills and problem-solving aptitude
- Ability to learn new technologies quickly
- Ability to work with other team members in a collaborative manner
- Passion for learning and working with versatile technologies
Notice Period: Immediate to 15 days
Posted 2 days ago
3.0 - 8.0 years
2 - 5 Lacs
Jamnagar
Hybrid
Required Skills:
- Working knowledge of Big Data / cloud-based / columnar databases (such as Snowflake, AWS Redshift, BigQuery, etc.)
- Experience in scripting languages like Python, R, Spark, Perl, etc.
- Very good knowledge of ETL concepts
- Experience in ETL tools like SSIS, Talend, Pentaho, etc., is a plus
- Very good knowledge of SQL querying
- Rich experience in relational data modelling
- Experience in developing logical, physical, and conceptual data models
- Experience in AWS and Google Cloud services is a plus
- Experience with BI dashboards is a plus
- Implementing automated testing platforms and unit tests
- Proficient understanding of code versioning tools, such as Git
- Strong analytical skills and problem-solving aptitude
- Ability to learn new technologies quickly
- Ability to work with other team members in a collaborative manner
- Passion for learning and working with versatile technologies
Notice Period: Immediate to 15 days
Posted 2 days ago
3.0 - 8.0 years
2 - 5 Lacs
Ahmedabad
Hybrid
Required Skills:
- Working knowledge of Big Data / cloud-based / columnar databases (such as Snowflake, AWS Redshift, BigQuery, etc.)
- Experience in scripting languages like Python, R, Spark, Perl, etc.
- Very good knowledge of ETL concepts
- Experience in ETL tools like SSIS, Talend, Pentaho, etc., is a plus
- Very good knowledge of SQL querying
- Rich experience in relational data modelling
- Experience in developing logical, physical, and conceptual data models
- Experience in AWS and Google Cloud services is a plus
- Experience with BI dashboards is a plus
- Implementing automated testing platforms and unit tests
- Proficient understanding of code versioning tools, such as Git
- Strong analytical skills and problem-solving aptitude
- Ability to learn new technologies quickly
- Ability to work with other team members in a collaborative manner
- Passion for learning and working with versatile technologies
Notice Period: Immediate to 15 days
Posted 2 days ago
3.0 - 8.0 years
2 - 5 Lacs
Nagapattinam
Hybrid
Required Skills:
- Working knowledge of Big Data / cloud-based / columnar databases (such as Snowflake, AWS Redshift, BigQuery, etc.)
- Experience in scripting languages like Python, R, Spark, Perl, etc.
- Very good knowledge of ETL concepts
- Experience in ETL tools like SSIS, Talend, Pentaho, etc., is a plus
- Very good knowledge of SQL querying
- Rich experience in relational data modelling
- Experience in developing logical, physical, and conceptual data models
- Experience in AWS and Google Cloud services is a plus
- Experience with BI dashboards is a plus
- Implementing automated testing platforms and unit tests
- Proficient understanding of code versioning tools, such as Git
- Strong analytical skills and problem-solving aptitude
- Ability to learn new technologies quickly
- Ability to work with other team members in a collaborative manner
- Passion for learning and working with versatile technologies
Notice Period: Immediate to 15 days
Posted 2 days ago