5.0 - 7.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
Who We Are
Boston Consulting Group partners with leaders in business and society to tackle their most important challenges and capture their greatest opportunities. BCG was the pioneer in business strategy when it was founded in 1963. Today, we help clients with total transformation: inspiring complex change, enabling organizations to grow, building competitive advantage, and driving bottom-line impact. To succeed, organizations must blend digital and human capabilities. Our diverse, global teams bring deep industry and functional expertise and a range of perspectives to spark change. BCG delivers solutions through leading-edge management consulting along with technology and design, corporate and digital ventures, and business purpose. We work in a uniquely collaborative model across the firm and throughout all levels of the client organization, generating results that allow our clients to thrive.

What You'll Do
As a Senior People Analyst, you'll support evidence-based decision making. You'll provide analytical input to support the Global People team and work with stakeholders across BCG.
You'll add a quantitative perspective to discussions on new and existing HR processes and procedures. You'll apply an analytics mindset and empower internal clients with dashboards, data, and reports that help improve processes and solve people-related challenges, providing thought leadership across the complete talent-analytics cycle, from sourcing candidates to managing attrition.

Key Responsibilities
- Develop, design, and manage advanced Tableau dashboards that integrate data from diverse sources
- Use SQL to query databases and retrieve relevant data for analysis and reporting
- Perform data extraction, transformation, and loading (ETL) to create efficient and scalable data models
- Partner with internal stakeholders on people-related challenges by developing domain expertise
- Ensure data accuracy and consistency through rigorous testing and quality checks
- Collaborate with cross-functional teams to gather requirements and understand data sources

You Are Good At
- Providing analytical support in metrics, reporting, and dashboard development
- Leading the technical aspects of a large project with minimal supervision
- Generating insights from large and complex datasets, and understanding the nuances and inconsistencies in data
- Multi-tasking and operating effectively in a fast-paced, customer-oriented environment; managing multiple stakeholders in a matrix organization
- Communicating and presenting technical details to non-technical stakeholders
- Building credibility through strong interpersonal skills and excelling in a collaborative setting

What You'll Bring
- Undergraduate degree, preferably in engineering or another technology-related field, with high academic achievement required; advanced degree preferred
- Must have: 5-7 years of full-time Tableau dashboard development, data modeling, and SQL expertise
- Advanced Tableau experience, including Tableau Server management, level-of-detail calculations, building custom charts, Hyper data sources, and JavaScript APIs
- Strong understanding of UX/UI principles for creating intuitive and visually impactful Tableau dashboards
- Knowledge of SQL for querying databases, optimizing data retrieval, and supporting data-driven decision-making
- Basic knowledge of Microsoft Excel, with skills in data manipulation including sorting, filtering, and using formulas to analyze and organize complex data sets
- Background in HR data analysis and HR domain knowledge is preferred, but not mandatory
- Deep interest and aptitude in data, metrics, and analysis

Who You'll Work With
As part of the People Analytics team, you will modernize HR platforms, capabilities, and engagement; automate and digitize core HR processes and operations; and enable greater efficiency. You will collaborate with the global People teams and colleagues across BCG to manage the life cycle of all BCG employees. The People Management Team (PMT) comprises several centers of expertise, including HR Operations, People Analytics, Career Development, Learning & Development, Talent Acquisition & Branding, Compensation, and Mobility. Our centers of expertise work together to build out new teams and capabilities by sourcing, acquiring, and retaining the best, diverse talent for BCG. We develop talent and capabilities while enhancing managers' effectiveness and building affiliation and engagement in our global offices. The PMT also harmonizes process efficiencies, automation, and global standardization. Through analytics and digitalization, we are always looking to expand our PMT capabilities and coverage.

Boston Consulting Group is an Equal Opportunity Employer.
All qualified applicants will receive consideration for employment without regard to race, color, age, religion, sex, sexual orientation, gender identity/expression, national origin, disability, protected veteran status, or any other characteristic protected under national, provincial, or local law, where applicable, and those with criminal histories will be considered in a manner consistent with applicable state and local laws. BCG is an E-Verify Employer.
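The BCG role above centers on querying people data in SQL to build metrics such as attrition. As a small, self-contained sketch of that kind of query: the table, columns, and data below are invented for illustration, not BCG's actual schema.

```python
import sqlite3

# Hypothetical HR table; schema and rows are illustrative only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE employees (id INTEGER, office TEXT, hire_date TEXT, exit_date TEXT);
INSERT INTO employees VALUES
  (1, 'Gurgaon', '2021-03-01', NULL),
  (2, 'Gurgaon', '2020-06-15', '2023-01-10'),
  (3, 'Gurgaon', '2022-04-01', NULL),
  (4, 'London',  '2019-09-01', '2022-11-30'),
  (5, 'London',  '2022-01-20', NULL);
""")

# Attrition per office: SQLite evaluates booleans as 0/1, so SUM() counts leavers.
rows = conn.execute("""
    SELECT office,
           COUNT(*) AS headcount,
           SUM(exit_date IS NOT NULL) AS leavers,
           ROUND(100.0 * SUM(exit_date IS NOT NULL) / COUNT(*), 1) AS attrition_pct
    FROM employees
    GROUP BY office
    ORDER BY office
""").fetchall()
print(rows)  # [('Gurgaon', 3, 1, 33.3), ('London', 2, 1, 50.0)]
```

The same aggregate-over-groups pattern extends naturally to headcount, tenure, and hiring-funnel metrics once the dashboards pull from a real warehouse.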
Posted 1 day ago
8.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
Role: Digital Marketing Manager / Global Marketing Manager
Location: Onsite / Gurgaon / Full Time
Shift: US (CST, 5 PM to 2 AM)

Who We Are:
This is Spearhead Technology, where every challenge is an opportunity, and every solution is a masterpiece in the making. As a full-lifecycle IT company, we transcend mere delivery; we engineer success. From inception to implementation, our seasoned expertise shepherds every phase of the journey. Be it planning, analysis, design, development, testing, or the seamless transition to production, we stand as steadfast partners in our clients' progress. At Spearhead Technology, quality isn't a mere aspiration; it's our ethos. Rooted in Tech Advisory, our methodology is guided by insights that spark transformative outcomes. We recognize the paramount importance of talent retention. Through a steadfast commitment to work-life balance, competitive remuneration packages, and an optimized operational model, we ensure our team remains as exceptional as our services. Step into Spearhead Technology, where innovation meets precision, and together, let's sculpt the future of technology with finesse and distinction.

Requirements
We are looking for a sharp, self-driven Digital Marketing Manager who brings both strategic clarity and tactical agility. You'll lead marketing initiatives that drive visibility, generate enterprise leads, and build our brand authority across key global markets. You'll work closely with sales, design, leadership, and our tech teams to build the marketing engine for a rapidly growing transformation business.

Key Responsibilities

B2B Demand Generation & Strategy
- Develop and execute integrated digital strategies tailored to enterprise buyers.
- Own the marketing funnel from awareness to MQL, collaborating with sales for pipeline acceleration.
- Define marketing KPIs and campaign goals aligned with business priorities.

Account-Based Marketing (ABM)
- Design and manage 1:1 and 1:few ABM programs for strategic accounts.
- Drive alignment with BD and delivery teams for account-specific content, messaging, and outreach.

Go-to-Market (GTM) and Campaign Management
- Plan and execute integrated GTM strategies for new services, capabilities, and solution offerings.
- Lead campaign management across regions and buyer personas, ensuring alignment with sales, product, and delivery teams.
- Develop campaign messaging, content assets, landing pages, and paid media strategy in sync with launch objectives.
- Track, report, and optimize campaign outcomes: leads, pipeline, and ROI.

Social Media Marketing
- Own the company's social media presence across LinkedIn, X, YouTube, and emerging channels.
- Build and manage a content calendar tailored to global audiences, trends, and leadership voices.
- Drive follower growth, engagement, and brand credibility through organic and paid initiatives.
- Collaborate with designers and content creators for sharp, platform-native creatives and campaigns.

Brand Marketing and Execution
- Strengthen the brand voice through consistent storytelling across channels, campaigns, and experiences.
- Champion brand guidelines, tone, and positioning in every communication touchpoint.
- Collaborate on IP campaigns, whitepapers, event content, and leadership-driven narratives that reflect our domain authority.
- Partner with CXOs, delivery heads, and tech leads to translate complex ideas into clear, compelling brand stories.

Content & Thought Leadership
- Work with internal experts to create compelling content that reflects our capabilities and business value.
- Plan whitepapers, case studies, and solution-led content for enterprise CXOs and decision-makers.
- Ensure all brand messaging reflects our tone: clean, confident, and client-focused.

Team Collaboration & Leadership
- Bring leadership presence, while also being comfortable rolling up your sleeves when needed.

What You'll Need to Succeed
- 4–8 years of B2B marketing experience in a tech, SaaS, consulting, or digital transformation firm.
- Strong grasp of ABM, enterprise marketing, and long-cycle B2B sales funnels.
- Proven campaign management and performance-optimization experience across organic, paid, and owned media.
- Hands-on with platforms like Google Ads, LinkedIn Ads, HubSpot, Zoho Campaigns, and Google Analytics.
- Ability to influence and execute: you're a strategic thinker who can also deliver independently.
- Excellent communication skills and cross-functional collaboration.

Benefits
Why Spearhead Technology?
- Be part of a fast-growing global company redefining enterprise transformation.
- Work with passionate innovators, problem-solvers, and global leaders.
- Freedom to lead, experiment, and scale your impact.
- Flexible work environment with performance-based growth and rewards.
Posted 1 day ago
50.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Your Team Responsibilities
The Sustainability & Climate Engineering team is responsible for building and maintaining MSCI's issuer-facing platform. This platform is strategically critical, serving as the interface between MSCI's core data products and external issuers, supporting key datasets, with further expansions planned. The team ensures high availability, compliance with regulatory obligations, and a seamless user experience through robust backend services and scalable architecture.

Your Key Responsibilities
- Design, develop, and maintain scalable backend services using Java (reactive programming) as part of the Data Provisioning application.
- Build and enhance event-driven flows using Azure Service Bus, and manage data workflows with Azure Cosmos DB.
- Collaborate closely with Product, QA, DevOps, and Frontend teams to implement new features, improve existing functionality, and ensure high system performance and reliability.
- Participate in system design discussions, code reviews, and continuous improvement efforts.
- Support regulatory and compliance features by ensuring traceability, data correctness, and issuer communication mechanisms.
- Ensure smooth deployment and monitoring of services on Azure App Services in collaboration with cloud infrastructure teams.

Your Skills And Experience That Will Help You Excel
- Strong experience with Java, particularly Spring Boot and reactive Java (e.g., Project Reactor, WebFlux) or similar reactive programming frameworks.
- Hands-on experience with Azure cloud services, especially Cosmos DB, Azure Service Bus, and deployment via Azure App Services.
- Experience in building and maintaining microservices architectures and asynchronous processing flows.
- Familiarity with Delta Tables, Spark, or Databricks is a plus.
- Strong problem-solving skills and ability to work in a collaborative, agile environment.
- Excellent communication skills and ability to partner effectively with cross-functional stakeholders.
About MSCI
What we offer you
- Transparent compensation schemes and comprehensive employee benefits, tailored to your location, ensuring your financial security, health, and overall wellbeing.
- Flexible working arrangements, advanced technology, and collaborative workspaces.
- A culture of high performance and innovation where we experiment with new ideas and take responsibility for achieving results.
- A global network of talented colleagues who inspire, support, and share their expertise to innovate and deliver for our clients.
- A Global Orientation program to kickstart your journey, followed by access to our Learning@MSCI platform, LinkedIn Learning Pro, and tailored learning opportunities for ongoing skills development.
- Multi-directional career paths that offer professional growth and development through new challenges, internal mobility, and expanded roles.
- An environment we actively nurture to build a sense of inclusion, belonging, and connection, including eight Employee Resource Groups: All Abilities, Asian Support Network, Black Leadership Network, Climate Action Network, Hola! MSCI, Pride & Allies, Women in Tech, and Women's Leadership Forum.

At MSCI we are passionate about what we do, and we are inspired by our purpose: to power better investment decisions. You'll be part of an industry-leading network of creative, curious, and entrepreneurial pioneers. This is a space where you can challenge yourself, set new standards, and perform beyond expectations for yourself, our clients, and our industry. MSCI is a leading provider of critical decision support tools and services for the global investment community. With over 50 years of expertise in research, data, and technology, we power better investment decisions by enabling clients to understand and analyze key drivers of risk and return and confidently build more effective portfolios.
We create industry-leading research-enhanced solutions that clients use to gain insight into and improve transparency across the investment process.

MSCI Inc. is an equal opportunity employer. It is the policy of the firm to ensure equal employment opportunity without discrimination or harassment on the basis of race, color, religion, creed, age, sex, gender, gender identity, sexual orientation, national origin, citizenship, disability, marital and civil partnership/union status, pregnancy (including unlawful discrimination on the basis of a legally protected parental leave), veteran status, or any other characteristic protected by law. MSCI is also committed to working with and providing reasonable accommodations to individuals with disabilities. If you are an individual with a disability and would like to request a reasonable accommodation for any part of the application process, please email Disability.Assistance@msci.com and indicate the specifics of the assistance needed. Please note, this e-mail is intended only for individuals who are requesting a reasonable workplace accommodation; it is not intended for other inquiries.

To all recruitment agencies: MSCI does not accept unsolicited CVs/Resumes. Please do not forward CVs/Resumes to any MSCI employee, location, or website. MSCI is not responsible for any fees related to unsolicited CVs/Resumes.

Note on recruitment scams: We are aware of recruitment scams where fraudsters impersonating MSCI personnel may try to elicit personal information from job seekers. Read our full note on careers.msci.com.
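The MSCI role above is built around event-driven, asynchronous message flows (Azure Service Bus feeding backend services). The role itself uses Java with Project Reactor/WebFlux; purely to illustrate the general producer/consumer shape of such a flow, here is a minimal Python asyncio sketch. All names (the issuer IDs, the sentinel convention) are invented for the example.

```python
import asyncio

async def producer(queue):
    # Stand-in for messages arriving on a Service Bus queue or topic.
    for issuer_id in ("ISS-001", "ISS-002", "ISS-003"):
        await queue.put(issuer_id)
    await queue.put(None)  # sentinel: no more messages

async def consumer(queue, processed):
    while True:
        msg = await queue.get()
        if msg is None:
            break
        # Stand-in for validating and persisting issuer data (e.g., to Cosmos DB).
        processed.append(msg.lower())

async def main():
    queue = asyncio.Queue()
    processed = []
    # Producer and consumer run concurrently, decoupled by the queue,
    # which is the essential property of an event-driven flow.
    await asyncio.gather(producer(queue), consumer(queue, processed))
    return processed

print(asyncio.run(main()))  # ['iss-001', 'iss-002', 'iss-003']
```

In the real system the queue would be durable and external, so producer and consumer can scale and fail independently; the decoupling principle is the same.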
Posted 1 day ago
0 years
0 Lacs
Mumbai Metropolitan Region
On-site
About Firstsource
Firstsource is a specialized global business process management partner. We provide transformational solutions and services spanning the customer lifecycle across Healthcare, Banking and Financial Services, Communications, Media and Technology, and other diverse industries. With an established presence in the US, the UK, India, Mexico, Australia, and the Philippines, we act as a trusted growth partner for leading global brands, including several Fortune 500 and FTSE 100 companies.

Key Responsibilities
- Perform data analysis to uncover patterns, trends, and insights to support decision-making.
- Build, validate, and optimize machine learning models for business use cases in EdTech, Healthcare, BFS, and Media.
- Develop scalable ETL pipelines to preprocess and manage large datasets.
- Communicate actionable insights through visualizations and reports to stakeholders.
- Collaborate with engineering teams to implement and deploy models in production (good to have).

Core Skills
- Data Analysis: Expert in Python (Pandas, NumPy), SQL, R, and exploratory data analysis (EDA).
- Machine Learning: Skilled in Scikit-learn, TensorFlow, PyTorch, and XGBoost for predictive modeling.
- Statistics: Strong understanding of regression, classification, hypothesis testing, and time-series analysis.
- Visualization: Proficient in Tableau, Power BI, Matplotlib, and Seaborn.
- ML Engineering (Good to Have): Experience with model deployment using AWS SageMaker, GCP AI, or Docker.
- Big Data (Good to Have): Familiarity with Spark, Hadoop, and distributed computing frameworks.

⚠️ Disclaimer: Firstsource follows a fair, transparent, and merit-based hiring process. We never ask for money at any stage. Beware of fraudulent offers and always verify through our official channels or @firstsource.com email addresses.
Posted 1 day ago
1.0 - 3.0 years
7 - 11 Lacs
Noida
Work from Office
We are looking for a Solution Analyst to join our Content Team in Noida. This is an amazing opportunity to work on Product. The team consists of four members and reports to the Senior Manager, Michael Duell. The team has a strong skill set in data analytics, XML, and intellectual property, and we would love to speak with you if you have skills in XML, intellectual property, data analytics, and requirement writing.

About You (experience, education, skills, and accomplishments)
Preferred skills:
- Basic knowledge of Python, XML, intellectual property (patents), and SQL; familiarity with JIRA
- 1-3 years of relevant professional experience in either intellectual property or solution analysis
- Bachelor's degree required; Master's degree preferred, preferably in a quantitative or technical field

It would be great if you also had:
Good analytical, verbal, and written interpersonal and communication skills, as the Solution Analyst is responsible for analyzing identified business needs and problems to elaborate requirements for innovative and effective solutions for internal and external customers.

What will you be doing in this role?
- Assist stakeholders in feature prioritization and development of scope for a quarter or release.
- Assess and evaluate existing solutions to determine their compatibility with new or evolving business needs.
- Collaborate with delivery teams to ensure the proposed solutions are technically feasible and sustainable.
- Analyse and interpret data to provide insights that inform decision-making.
- Create and maintain detailed Agile-oriented artifacts for proposed solutions, including functional and non-functional specifications, system flows, and data models, throughout the development lifecycle.
- Collaborate with cross-functional teams, including product owners, developers, QA, UX, architecture, and delivery managers, to communicate and clarify requirements.
- Communicate technical concepts to non-technical stakeholders in a clear and understandable manner.
- Define acceptance criteria and work with QA teams to ensure that solutions meet the specified requirements and are free of defects.
- Participate in testing processes to validate the functionality and performance of solutions.
- Participate in defect troubleshooting and resolution.

Product you will be developing
We acquire patent data from patent offices across the globe, convert the data into a common format, and supply it to external and internal customers. The Solution Analyst will perform primary analysis on the data, identify patterns, and provide instructions to the data engineers to create rules for converting a specific data format into the Clarivate-specific common format. The Solution Analyst will also be involved in signing off the common format and in workflow development for the sub-systems that deliver data to customers.

About the Team
The Content team is a dynamic group of professionals dedicated to advancing the field of intellectual property, with deep expertise in XML technology, which is crucial for managing and processing vast amounts of patent data. Primarily, we support Derwent Innovation, a patent research platform of Clarivate. We also serve multiple patent offices and tech giants across the globe, supplying consolidated patent data. We collaborate with the data acquisition, data engineering, and distribution teams. The major data format we handle is XML; we also work with some JSON data.
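The Clarivate posting describes converting office-specific patent XML into a common format. As a toy illustration of that mapping step with Python's standard `xml.etree.ElementTree`: the element names, attributes, and the output dict are all invented for the example; real patent-office DTDs and the Clarivate common format differ.

```python
import xml.etree.ElementTree as ET

# A tiny, made-up office-specific record (not a real patent-office schema).
source = """
<patent-document office="EP">
  <pub-number>EP1234567</pub-number>
  <title lang="en">Widget with improved flange</title>
  <applicant>Acme GmbH</applicant>
</patent-document>
"""

def to_common_format(xml_text):
    """Map an office-specific XML record into a flat common dict (illustrative)."""
    root = ET.fromstring(xml_text)
    return {
        "office": root.get("office"),
        "publication_number": root.findtext("pub-number"),
        "title": root.findtext("title"),
        "applicants": [a.text for a in root.findall("applicant")],
    }

print(to_common_format(source))
```

In the real pipeline, the Solution Analyst's "conversion rules" would be many such mappings, one per source format, all emitting the same target structure.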
Posted 1 day ago
5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Talworx is hiring for Transformation Engineering.

Mandatory
- Bachelor's degree in Computer Science, Information Systems, or another applicable field is preferred
- 5+ years of experience in application development, deployment, and support
- Experience working across Java, JEE, JSP, Spring, Spring Boot (microservices), Spring JPA, REST, JSON, JUnit, React, Python, JavaScript, HTML, and XML
- 3+ years of experience in a Platform/Application Engineering role in support of on-prem and cloud-based deployments (Azure preferred)

Good to Have
- 3+ years of experience with platform/application administration
- Extensive experience with software deployments on Linux and Windows systems
- Experience working with Spark, Docker, containers, Kubernetes, microservices, data analytics, visualization tools, and Git
- Experience building and supporting modern AI technologies: Azure OpenAI and LLM infrastructure/applications
- Experience deploying and maintaining applications and infrastructure via configuration management software (Ansible, Terraform) using IaC best practices
- Extensive scripting skills (e.g., Bash, Python)
- Experience using GitHub to manage application and infrastructure deployment lifecycles in a structured CI/CD environment
- Experience working in a structured ITSM change management environment
- Knowledge of configuring monitoring solutions and creating dashboards (Splunk, Wily, Prometheus, Grafana, Dynatrace, Azure Monitor)
Posted 1 day ago
5.0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Talworx is hiring for Transformation Engineering.

Mandatory
- Bachelor's degree in Computer Science, Information Systems, or another applicable field is preferred
- 5+ years of experience in application development, deployment, and support
- Experience working across Java, JEE, JSP, Spring, Spring Boot (microservices), Spring JPA, REST, JSON, JUnit, React, Python, JavaScript, HTML, and XML
- 3+ years of experience in a Platform/Application Engineering role in support of on-prem and cloud-based deployments (Azure preferred)

Good to Have
- 3+ years of experience with platform/application administration
- Extensive experience with software deployments on Linux and Windows systems
- Experience working with Spark, Docker, containers, Kubernetes, microservices, data analytics, visualization tools, and Git
- Experience building and supporting modern AI technologies: Azure OpenAI and LLM infrastructure/applications
- Experience deploying and maintaining applications and infrastructure via configuration management software (Ansible, Terraform) using IaC best practices
- Extensive scripting skills (e.g., Bash, Python)
- Experience using GitHub to manage application and infrastructure deployment lifecycles in a structured CI/CD environment
- Experience working in a structured ITSM change management environment
- Knowledge of configuring monitoring solutions and creating dashboards (Splunk, Wily, Prometheus, Grafana, Dynatrace, Azure Monitor)
Posted 1 day ago
4.0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Responsibilities
- Design, develop, and implement robust Big Data solutions using technologies such as Hadoop, Spark, and NoSQL databases
- Build and maintain scalable data pipelines for effective data ingestion, transformation, and analysis
- Collaborate with data scientists, analysts, and cross-functional teams to understand business requirements and translate them into technical solutions
- Ensure data quality and integrity through effective validation, monitoring, and troubleshooting techniques
- Optimize data processing workflows for maximum performance and efficiency
- Stay up-to-date with evolving Big Data technologies and methodologies to enhance existing systems
- Implement best practices for data governance, security, and compliance
- Document technical designs, processes, and procedures to support knowledge sharing across teams

Requirements
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field
- 4+ years of experience as a Big Data Engineer or in a similar role
- Strong proficiency in Big Data technologies (Hadoop, Spark, Hive, Pig) and frameworks
- Extensive experience with programming languages such as Python, Scala, or Java
- Knowledge of data modeling and data warehousing concepts
- Familiarity with NoSQL databases like Cassandra or MongoDB
- Proficient in SQL for data querying and analysis
- Strong analytical and problem-solving skills
- Excellent communication and collaboration abilities
- Ability to work independently and effectively in a fast-paced environment

Benefits
- Competitive salary and benefits package
- Opportunity to work on cutting-edge technologies and solve complex challenges
- Dynamic and collaborative work environment with opportunities for growth and career advancement
- Regular training and professional development opportunities
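The first two responsibilities above, ingestion, transformation, and validation in a data pipeline, can be sketched in miniature with plain Python generators; real pipelines would run the same extract/transform/load stages on Spark or Hadoop. The CSV data and column names here are invented for the example.

```python
import csv
import io

# Toy ETL: extract rows from CSV text, transform/validate them, then load.
raw = io.StringIO("user_id,country,amount\n1,IN,250\n2,US,\n3,IN,120\n")

def extract(fh):
    # Extraction: stream records from the source without loading it all at once.
    yield from csv.DictReader(fh)

def transform(rows):
    for row in rows:
        if not row["amount"]:          # validation: drop incomplete records
            continue
        row["amount"] = int(row["amount"])  # type normalization
        yield row

def load(rows):
    # Loading: here, aggregate into per-country totals as the "sink".
    totals = {}
    for row in rows:
        totals[row["country"]] = totals.get(row["country"], 0) + row["amount"]
    return totals

print(load(transform(extract(raw))))  # {'IN': 370}
```

Because each stage is a generator, records flow through one at a time, the same streaming property that makes distributed pipelines scale.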
Posted 1 day ago
10.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Hi, greetings from Peoplefy Infosolutions!

We are hiring for one of our reputed MNC clients based in Pune. We are looking for candidates with 10+ years of experience who are currently working as a Data Architect.

Job Description:
We are seeking a highly skilled and experienced Cloud Data Architect to design, implement, and manage scalable, secure, and efficient cloud-based data solutions. The ideal candidate will possess a strong combination of technical expertise, analytical skills, and the ability to collaborate effectively with cross-functional teams to translate business requirements into technical solutions.

Key Responsibilities:
- Design and implement data architectures, including data pipelines, data lakes, and data warehouses, on cloud platforms.
- Develop and optimize data models (e.g., star schema, snowflake schema) to support business intelligence and analytics.
- Leverage big data technologies (e.g., Hadoop, Spark, Kafka) to process and analyze large-scale datasets.
- Manage and optimize relational and NoSQL databases for performance and scalability.
- Develop and maintain ETL/ELT workflows using tools like Apache NiFi, Talend, or Informatica.
- Ensure data security and compliance with regulations such as GDPR and CCPA.
- Automate infrastructure deployment using CI/CD pipelines and Infrastructure as Code (IaC) tools (e.g., Terraform, CloudFormation).
- Collaborate with analytics teams to integrate machine learning frameworks and visualization tools (e.g., Tableau, Power BI).
- Provide technical leadership and mentorship to team members.

Interested candidates for the above position, kindly share your CV at sneh.ne@peoplefy.com.
Posted 1 day ago
2.0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
Company Description
Wide Reach is a market leader providing marketing solutions. Its work includes graphics and illustration, website development, marketing, and promotion. Wide Reach seeks a talented and passionate Content Writer to join our growing team. Here, you'll be the wordsmith extraordinaire, crafting compelling content that informs, engages, and inspires.

Role Description
This is a full-time on-site role for a Content Writer located in Ahmedabad. The Content Writer will be responsible for web content writing, content strategy development, research, writing, and proofreading.

Qualifications
- Web content writing, writing, and proofreading skills
- Experience in content strategy development and research
- Strong communication and writing skills
- Ability to work on-site in Ahmedabad
- Bachelor's degree in English, Journalism, Communications, or a related field

Why Wide Reach?
* Be the content king/queen: We empower our writers to take ownership and infuse their unique voice into every project.
* Content across the spectrum: Dive into a variety of digital projects, from blog posts and website copy to social media captions and email marketing campaigns.
* Collaborative spirit: Work alongside a dynamic team of designers, strategists, and digital ninjas to bring campaigns to life.
* Sharpen your skills: Refine your craft and stay ahead of the curve with ongoing learning and development initiatives.

What you'll conquer:
* Content calendar creator: Strategically plan and organize monthly social media content calendars for diverse industry clients.
* Script & caption expert: Craft engaging, on-brand scripts for reels/videos and scroll-stopping captions that spark engagement.
* Trend tracker: Stay ahead of social media trends, formats, and viral hooks to keep content fresh and relevant.
* Brand voice guardian: Maintain a unique, consistent tone for each client, adapting seamlessly across industries.
Multi-platform pro: Create tailored content for Instagram, Facebook, LinkedIn, YouTube Shorts, and more. Engagement driver: Write content that not only attracts but also encourages conversations and builds community. Feedback loop: Embrace client feedback and refine content to deliver the best-performing results. Are you the social media wordsmith we’ve been waiting for? ✅ Proven experience in social media content creation & strategy (2+ years preferred) ✅ A portfolio showcasing captions, scripts & content plans for different industries ✅ Strong understanding of social media platforms, trends & audience psychology ✅ Excellent grammar, storytelling, and creativity skills ✅ Ability to juggle multiple clients while meeting deadlines ✅ Passion for digital marketing with a knack for creative copywriting Bonus points for: ✨ Knowledge of social media analytics & SEO-friendly captions ✨ Experience with content scheduling tools (Buffer, Later, Hootsuite, etc.) ✨ Ability to ideate campaign concepts for festivals, launches & trending moments ✨ A sprinkle of creative flair with a keen eye for visual aesthetics Ready to join the Wide Reach family and create content that resonates across industries?
Posted 1 day ago
4.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Responsibilities Design, develop, and implement robust Big Data solutions using technologies such as Hadoop, Spark, and NoSQL databases Build and maintain scalable data pipelines for effective data ingestion, transformation, and analysis Collaborate with data scientists, analysts, and cross-functional teams to understand business requirements and translate them into technical solutions Ensure data quality and integrity through effective validation, monitoring, and troubleshooting techniques Optimize data processing workflows for maximum performance and efficiency Stay up-to-date with evolving Big Data technologies and methodologies to enhance existing systems Implement best practices for data governance, security, and compliance Document technical designs, processes, and procedures to support knowledge sharing across teams Requirements Bachelor's or Master's degree in Computer Science, Engineering, or a related field 4+ years of experience as a Big Data Engineer or in a similar role Strong proficiency in Big Data technologies (Hadoop, Spark, Hive, Pig) and frameworks Extensive experience with programming languages such as Python, Scala, or Java Knowledge of data modeling and data warehousing concepts Familiarity with NoSQL databases like Cassandra or MongoDB Proficient in SQL for data querying and analysis Strong analytical and problem-solving skills Excellent communication and collaboration abilities Ability to work independently and effectively in a fast-paced environment Benefits Competitive salary and benefits package. Opportunity to work on cutting-edge technologies and solve complex challenges. Dynamic and collaborative work environment with opportunities for growth and career advancement. Regular training and professional development opportunities.
Posted 1 day ago
6.0 years
0 Lacs
Pune, Maharashtra, India
On-site
SQL Lead - COE
Location: Pune
Experience: 6-10 Years

We are looking to hire a Data Engineer with strong hands-on experience in SQL and PL/SQL.

Required Skills & Abilities:
- 6+ years of experience with databases (MS-SQL/Oracle/Teradata/Netezza)
- 4+ years of experience managing teams and client calls
- Strong expertise in writing complex SQL queries, joins, subqueries, and analytical functions
- Hands-on experience with stored procedures, functions, triggers, packages, and cursors
- Understanding of database design principles, normalization, and partitioning
- Knowledge of ETL processes, data migration, and data transformation
- Experience working with Oracle SQL Developer or other database tools
- Ability to analyze requirements and translate them into efficient database solutions
- Familiarity with UNIX/Linux shell scripting for automation (preferred)
- Strong problem-solving and debugging skills
- Good communication skills and ability to work in a collaborative environment

Key Responsibilities:
- Develop, optimize, and maintain PL/SQL stored procedures, functions, triggers, and packages
- Write complex SQL queries, views, and indexes for data manipulation and reporting
- Optimize SQL queries and database performance using indexing, partitioning, and query tuning techniques
- Ensure data integrity and security by implementing constraints, validations, and best practices
- Work with cross-functional teams to understand business requirements and design efficient database solutions
- Troubleshoot database issues, debug PL/SQL code, and improve query performance
- Implement ETL processes using SQL and PL/SQL
- Perform database schema design, normalization, and optimization
- Collaborate with DBA teams on database backup, recovery, and maintenance
- Develop and maintain database documentation, coding standards, and best practices

Preferred Qualifications:
- Experience with cloud databases (GCP or any other cloud) is a plus
- Exposure to big data technologies like Hadoop and Spark (optional)
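As a small illustration of the "analytical functions" the posting lists, the following uses SQLite's window-function support from Python; the table and columns are invented for the example, and the same SQL would apply on Oracle, Teradata, or MS-SQL with minor dialect changes.

```python
# Rank each customer's orders by amount using ROW_NUMBER(), one of the
# analytical (window) functions the role calls for. Schema is invented.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [("a", 10.0), ("a", 30.0), ("b", 20.0)])

rows = conn.execute("""
    SELECT customer, amount,
           ROW_NUMBER() OVER (PARTITION BY customer ORDER BY amount DESC) AS rn
    FROM orders
    ORDER BY customer, rn
""").fetchall()
print(rows)  # → [('a', 30.0, 1), ('a', 10.0, 2), ('b', 20.0, 1)]
```

The same PARTITION BY pattern underlies common tasks such as deduplication (keep rn = 1) and per-group top-N reporting.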
Posted 1 day ago
3.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Responsibilities
- Architect and build real-time feature pipelines and model training workflows
- Design, train, validate, and deploy RUL predictors, anomaly detectors, and LP-based grid solvers
- Implement MLOps best practices: MLflow/Kubeflow pipelines, model registry, canary deployments
- Collaborate on explainability modules (SHAP, LIME) and drift-detection alerts
- Optimize GPU utilization; automate retraining schedules and performance monitoring

Skills & Experience
- 3+ years in ML engineering or data science roles; production-grade ML deployments
- Expertise in time-series modeling: LSTM, GRU, isolation forest, ensemble methods
- Strong Python skills; frameworks: TensorFlow, PyTorch, Scikit-Learn
- Experience with Kubernetes-based MLOps: Kubeflow, KServe, MLflow
- Proficiency tuning and deploying on NVIDIA GPUs (H100, H200)

Nice-to-Have
- Domain experience in predictive maintenance, grid optimization, or IIoT
- Familiarity with feature-store design (TimescaleDB, Feast) and Spark-on-GPU
- Knowledge of explainability libraries and regulatory compliance for AI
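The posting names isolation forests and LSTM-based detectors; as a library-free stand-in for the same idea (flagging unusual points in a sensor series), here is a simple z-score detector. The threshold, data, and 2-sigma cutoff are all illustrative assumptions, not a production choice.

```python
# Toy anomaly detector: flag points more than `threshold` sample standard
# deviations from the mean. Readings and cutoff are invented for the demo.
from statistics import mean, stdev

def zscore_anomalies(series, threshold=2.0):
    mu, sigma = mean(series), stdev(series)
    return [i for i, x in enumerate(series) if abs(x - mu) > threshold * sigma]

readings = [10.1, 9.9, 10.0, 10.2, 9.8, 42.0, 10.1]
print(zscore_anomalies(readings))  # → [5]
```

Real RUL and grid-monitoring pipelines would replace this with a trained model, but the surrounding plumbing (score each point, emit indices that breach a threshold, wire the result into alerting) has the same shape.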
Posted 1 day ago
3.0 years
7 - 10 Lacs
India
On-site
Role Objective
Develop business-relevant, high-quality, scalable web applications. You will be part of a dynamic AdTech team solving big problems in the Media and Entertainment sector.

Roles & Responsibilities
- Application Design: Understand requirements from the user, create stories, and be part of the design team. Check designs, give regular feedback, and ensure the designs meet user expectations.
- Architecture: Create scalable and robust system architecture, in line with the client's infrastructure. This could be on-prem or cloud (Azure, AWS, or GCP).
- Development: You will be responsible for front-end and back-end development. Depending on the project, the application stack will comprise SQL, Django, Angular/React, HTML, and CSS. Knowledge of GoLang and Big Data is a plus.
- Deployment: Suggest and implement a deployment strategy that is scalable and cost-effective. Create a detailed resource architecture and get it approved. CI/CD deployment on IIS or Linux. Knowledge of Docker is a plus.
- Maintenance: Maintaining development and production environments will be a key part of your job profile. This includes troubleshooting, fixing bugs, and suggesting ways to improve the application.
- Data Migration: In the case of database migration, you will be expected to suggest appropriate strategies and implementation plans.
- Documentation: Create detailed documents covering important aspects such as HLD, technical diagrams, script design, and SOPs.
- Client Interaction: You will interact with the client on a day-to-day basis, so good communication skills are a must.

Requirements
- Education: B.Tech (Computer Science, IT) or equivalent
- Experience: 3+ years of experience developing applications on Django, Angular/React, HTML, and CSS
- Behavioural skills: clear and assertive communication; ability to comprehend business requirements; teamwork and collaboration; analytical thinking; time management; strong troubleshooting and problem-solving skills
- Technical skills: back-end and front-end technologies (Django, Angular/React, HTML, and CSS); cloud technologies (AWS, GCP, and Azure); Big Data technologies (Hadoop and Spark); containerized deployment (Docker and Kubernetes is a plus); understanding of Golang is a plus

Skills: sql, hadoop, kubernetes, spark, docker, css, html, react.js, gcp, azure, python, golang, react, django, angular, aws
Posted 1 day ago
4.0 - 9.0 years
6 - 10 Lacs
Hyderabad, Bengaluru, Secunderabad
Work from Office
We are looking for a Senior Software Engineer to join our IMS team in Bangalore. This is an amazing opportunity to work on the Big Data technologies involved in content ingestion. The team consists of 10-12 engineers and reports to the Sr. Manager. Our stack includes Spark, Java, Scala, Hive, SQL, XSLT, AWS EMR, S3, and more, and we would love to speak with you if you have skills in the same areas.

About You (experience, education, skills, and accomplishments):
- Work experience: minimum 4 years of experience in Big Data projects involving content ingestion, curation, and transformation
- Technical skills: Spark, Python/Java, Scala, AWS EMR, S3, SQS, Hive, XSLT
- Education: bachelor's degree in computer science, mechanical engineering, or a related field, or at least 4 years of equivalent relevant experience

It would be great if you also had:
- Experience in analyzing and optimizing performance
- Exposure to automation test frameworks
- Databricks
- Java and Python programming

What will you be doing in this role?
- Take an active role in planning, estimation, design, development, and testing of large-scale, enterprise-wide initiatives to build or enhance a platform or custom applications used for the acquisition, transformation, entity extraction, and mining of content on behalf of business units across Clarivate Analytics
- Troubleshoot and address production issues within the given SLA
- Coordinate with global representatives and teams
Posted 1 day ago
14.0 - 19.0 years
7 - 12 Lacs
Noida
Work from Office
We are looking for a Senior Manager - MLOps to join our Technology team at Clarivate. You will get the opportunity to work in a cross-cultural environment on the latest web technologies, with an emphasis on user-centered design.

About You (Skills & Experience Required)
- Bachelor's or master's degree in computer science, engineering, or a related field
- 14+ years of overall experience spanning DevOps, machine learning operations, and data engineering
- Proven experience managing and leading technical teams
- Strong understanding of MLOps practices, tools, and frameworks
- Proficiency in data pipelines, data cleaning, and feature engineering is essential for preparing data for model training
- Knowledge of programming languages (Python, R) and version control systems (Git) is necessary for building and maintaining MLOps pipelines
- Experience with MLOps-specific tools and platforms (e.g., Kubeflow, MLflow, Airflow) can streamline MLOps workflows
- Knowledge of DevOps principles, including CI/CD pipelines, infrastructure as code (IaC), and monitoring, is helpful for automating ML workflows
- Familiarity with cloud platforms (AWS, GCP, Azure) and their associated services (e.g., compute, storage, ML platforms) is essential for deploying and scaling ML models
- Familiarity with container orchestration tools like Kubernetes can help manage and scale ML workloads efficiently

It would be great if you also had:
- Experience with big data technologies (Hadoop, Spark)
- Knowledge of data governance and security practices
- Familiarity with DevOps practices and tools

What will you be doing in this role?
- Data science model deployment & monitoring: oversee the deployment of machine learning models into production environments; ensure continuous monitoring and performance tuning of deployed models; implement robust CI/CD pipelines for model updates and rollbacks
- Collaboration: collaborate with cross-functional teams to understand business requirements and translate them into technical solutions; communicate project status, risks, and opportunities to stakeholders; provide technical guidance and support to team members
- Infrastructure & automation: design and manage scalable infrastructure for model training and deployment; automate repetitive tasks to improve efficiency and reduce errors; ensure the infrastructure meets security and compliance standards
- Innovation & improvement: stay updated with the latest trends and technologies in MLOps; identify opportunities for process improvement and implement them; drive innovation within the team to enhance MLOps capabilities
Posted 1 day ago
5.0 - 8.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job description:

Role Purpose
The purpose of the role is to support process delivery by ensuring daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialists.

Do
- Oversee and support the process by reviewing daily transactions on performance parameters
- Review the performance dashboard and the scores for the team
- Support the team in improving performance parameters by providing technical support and process guidance
- Record, track, and document all queries received, problem-solving steps taken, and total successful and unsuccessful resolutions
- Ensure standard processes and procedures are followed to resolve all client queries
- Resolve client queries as per the SLAs defined in the contract
- Develop an understanding of the process/product for the team members to facilitate better client interaction and troubleshooting
- Document and analyze call logs to spot the most frequently occurring trends and prevent future problems
- Identify red flags and escalate serious client issues to the team leader in cases of untimely resolution
- Ensure all product information and disclosures are given to clients before and after the call/email requests
- Avoid legal challenges by monitoring compliance with service agreements

Handle technical escalations through effective diagnosis and troubleshooting of client queries
- Manage and resolve technical roadblocks/escalations as per SLA and quality requirements
- If unable to resolve an issue, escalate it to TA & SES in a timely manner
- Provide product support and resolution to clients by performing question diagnosis and guiding users through step-by-step solutions
- Troubleshoot all client queries in a user-friendly, courteous, and professional manner
- Offer alternative solutions to clients (where appropriate) with the objective of retaining customers' and clients' business
- Organize ideas and effectively communicate oral messages appropriate to listeners and situations
- Follow up and make scheduled callbacks to customers to record feedback and ensure compliance with contract SLAs

Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client
- Mentor and guide Production Specialists on improving technical knowledge
- Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists
- Develop and conduct trainings (triages) within products for Production Specialists as per targets
- Inform the client about the triages being conducted
- Undertake product trainings to stay current with product features, changes, and updates
- Enroll in product-specific and other trainings per client requirements/recommendations
- Identify and document the most common problems and recommend appropriate resolutions to the team
- Update job knowledge by participating in self-learning opportunities and maintaining personal networks

Deliver
No. | Performance Parameter | Measure
1 | Process | No. of cases resolved per day, compliance to process and quality standards, meeting process-level SLAs, Pulse score, customer feedback, NSAT/ESAT
2 | Team management | Productivity, efficiency, absenteeism
3 | Capability development | Triages completed, technical test performance

Mandatory Skills: Apache Spark
Experience: 5-8 Years

Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention: of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA: as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.
Posted 1 day ago
5.0 - 10.0 years
5 - 9 Lacs
Noida
Work from Office
We are looking for an MLOps Engineer to join our Technology team at Clarivate. You will get the opportunity to work in a cross-cultural environment on the latest web technologies, with an emphasis on user-centered design.

About You (Skills & Experience Required)
- Bachelor's or master's degree in computer science, engineering, or a related field
- 5+ years of experience in machine learning, data engineering, or software development
- Good experience building data pipelines, data cleaning, and feature engineering is essential for preparing data for model training
- Knowledge of programming languages (Python, R) and version control systems (Git) is necessary for building and maintaining MLOps pipelines
- Experience with MLOps-specific tools and platforms (e.g., Kubeflow, MLflow, Airflow) can streamline MLOps workflows
- Knowledge of DevOps principles, including CI/CD pipelines, infrastructure as code (IaC), and monitoring, is helpful for automating ML workflows
- Experience with at least one of the cloud platforms (AWS, GCP, Azure) and their associated services (e.g., compute, storage, ML platforms) is essential for deploying and scaling ML models
- Familiarity with container orchestration tools like Kubernetes can help manage and scale ML workloads efficiently

It would be great if you also had:
- Experience with big data technologies (Hadoop, Spark)
- Knowledge of data governance and security practices
- Familiarity with DevOps practices and tools

What will you be doing in this role?
- Model deployment & monitoring: oversee the deployment of machine learning models into production environments; ensure continuous monitoring and performance tuning of deployed models; implement robust CI/CD pipelines for model updates and rollbacks
- Collaboration: collaborate with cross-functional teams to understand business requirements and translate them into technical solutions; communicate project status, risks, and opportunities to stakeholders; provide technical guidance and support to team members
- Infrastructure & automation: design and manage scalable infrastructure for model training and deployment; automate repetitive tasks to improve efficiency and reduce errors; ensure the infrastructure meets security and compliance standards
- Innovation & improvement: stay updated with the latest trends and technologies in MLOps; identify opportunities for process improvement and implement them; drive innovation within the team to enhance MLOps capabilities
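The continuous monitoring this role covers often starts with something as simple as comparing a live feature's distribution against its training baseline. Here is a toy mean-shift drift check; the baseline values and the 10% relative tolerance are assumptions for illustration, not a recommended production threshold.

```python
# Toy drift monitor: alert when the live window's mean moves more than
# `tolerance` (relative) away from the training baseline's mean.
from statistics import mean

def drift_alert(baseline, live, tolerance=0.10):
    """True when the live mean drifts past the relative tolerance."""
    base_mu = mean(baseline)
    return abs(mean(live) - base_mu) > tolerance * abs(base_mu)

baseline = [0.50, 0.52, 0.48, 0.51]
print(drift_alert(baseline, [0.49, 0.51, 0.50]))  # stable window → False
print(drift_alert(baseline, [0.70, 0.72, 0.69]))  # shifted window → True
```

Production systems typically use distribution-level tests (e.g., population stability index) and wire the alert into the retraining pipeline, but the contract is the same: a boolean signal computed from baseline vs. live statistics.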
Posted 1 day ago
6.0 years
0 Lacs
Kolkata metropolitan area, West Bengal, India
On-site
Job Title: Senior Data Engineer - Databricks | Azure | PySpark
Location: Kolkata | Bengaluru | Hyderabad
Experience: 6+ Years
Job Type: Full-Time
Industry: Insurance background is a plus

Job Summary:
We are seeking a highly skilled and experienced Senior Associate with a strong background in Databricks, SQL, PySpark, and Microsoft Azure. The ideal candidate will have Insurance domain knowledge and be responsible for building and optimizing data pipelines and architectures, transforming raw data into usable formats, and collaborating with data scientists, analysts, and other stakeholders to support data-driven decision-making.

Key Responsibilities:
- Design, build, and maintain scalable data pipelines using Apache Spark and PySpark in Databricks
- Develop and manage data integration and ETL workflows in Azure Data Factory and Databricks
- Optimize and troubleshoot Spark jobs and ensure efficient use of compute and memory resources
- Write complex SQL queries to extract, transform, and analyze large datasets across multiple data sources
- Implement data governance, quality, and security practices per organizational standards
- Collaborate with cross-functional teams to define data requirements and implement scalable solutions
- Monitor, maintain, and optimize the performance of data platforms hosted on Azure (e.g., ADLS, Azure Synapse, Azure SQL)
- Provide technical leadership and mentoring to junior team members

Required Skills & Qualifications:
- 6+ years of hands-on experience in Data Engineering
- Strong expertise in Databricks, including notebooks, clusters, Delta Lake, and job orchestration
- Proficiency in PySpark and distributed computing using Apache Spark
- Expert-level knowledge of SQL for data analysis, transformation, and optimization
- Extensive experience with Azure cloud services, including Azure Data Factory, Azure Data Lake Storage (ADLS), Azure Synapse, and Azure SQL DB
- Experience with CI/CD pipelines, version control (e.g., Git), and DevOps best practices in an Azure environment
- Solid understanding of data modeling, data warehousing, and data governance practices
- Strong analytical and problem-solving skills
- Strong communication skills and end-client-facing experience

Preferred Qualifications:
- Databricks Certified Developer or Microsoft Azure certification(s) (e.g., DP-203)
- Experience with real-time data streaming (e.g., Kafka, Azure Event Hubs) is a plus
- Familiarity with scripting languages such as Python for automation
- Experience working in Agile/Scrum environments
- Insurance domain knowledge is a plus
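The "data governance, quality, and security practices" work this posting describes usually includes a quality gate run before each load. Here is a minimal, library-free sketch of such a gate; the `policy_id` key and record schema are invented for illustration (in Databricks the equivalent checks would run as PySpark aggregations or Delta Lake constraints).

```python
# Hypothetical pre-load data-quality gate: count null and duplicate keys.
# The schema is invented; real pipelines would run this on a DataFrame.
from collections import Counter

def quality_report(rows, key="policy_id"):
    nulls = sum(1 for r in rows if r.get(key) is None)
    counts = Counter(r[key] for r in rows if r.get(key) is not None)
    dupes = [k for k, c in counts.items() if c > 1]
    return {"rows": len(rows), "null_keys": nulls, "duplicate_keys": dupes}

rows = [{"policy_id": "P1"}, {"policy_id": "P2"},
        {"policy_id": "P1"}, {"policy_id": None}]
print(quality_report(rows))  # → {'rows': 4, 'null_keys': 1, 'duplicate_keys': ['P1']}
```

A pipeline would typically fail or quarantine the batch when `null_keys` or `duplicate_keys` exceeds an agreed threshold, rather than silently loading it.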
Posted 1 day ago
3.0 - 8.0 years
15 - 19 Lacs
Mumbai
Work from Office
Project description
As an Engineer within the Public Markets Technology Department, you will play a pivotal role in developing and enhancing best-in-class applications that support global investment and co-investment strategies. This role involves close collaboration with both technology and business teams to modernize and evolve the technology landscape, enabling strategic and operational excellence.

Responsibilities
- PMWB/Cosmos: Provide operational support for PMs in London and Hong Kong for Order Entry and Position Management. Currently in scope are EPM, SSG, and AE, with future expansion for LRA in London and Australia.
- EPM Validator: Onboard, enhance, and maintain fund performance data for users in London and Hong Kong.
- EPM Explorer: NAV and performance loading configuration changes. This requires Python code and config changes.
- Batch: Cosmos and PMWB SOD support; EPM Risk Data; expand monitoring of Data Fabric CMF pipelines to anticipate abnormal run times.
- Support: Partner with the London ARS-FICC and FCT teams to make judgement calls and address failures independently of the Toronto office.

Skills

Must have
- University degree in Engineering or Computer Science preferred
- 3+ years of experience in software development
- Strong knowledge of and demonstrated experience with Python is a must
- Experience with Java is an asset
- Experience with relational and non-relational databases is an asset
- Demonstrated experience developing applications on AWS; AWS certification is preferred
- Experience in the capital markets industry is nice to have, including knowledge of various financial products and derivatives
- Strong desire to learn how the business operates and how technology helps it achieve its goals
- An entrepreneurial attitude and the ability to work in a fast-paced environment and manage competing priorities
- Experience working in an Agile environment
- Ability and willingness to adapt and contribute as needed to ensure the team meets its goals

Nice to have
- Public Markets Technology
Posted 1 day ago
5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Overview

Working at Atlassian
Atlassians can choose where they work – whether in an office, from home, or a combination of the two. That way, Atlassians have more control over supporting their family, personal goals, and other priorities. We can hire people in any country where we have a legal entity. Interviews and onboarding are conducted virtually, a part of being a distributed-first company.

Your Future Team
At DevInfra Transformations, we enable teams to ship quality at velocity. Our focus is to build an industry-leading internal platform, validate our hypotheses with data, and ship features for internal and external usage. This is accomplished by building a complete end-to-end ML-enabled development cycle and leveraging AI capabilities extensively. This is a unique opportunity to work in a collaborative environment, implement cutting-edge machine learning techniques, especially forecasting models, and tackle challenging and distinctive problems.

Responsibilities
As a Machine Learning Engineer 2, you will work on the development and implementation of cutting-edge machine learning algorithms, train sophisticated models, and collaborate with engineering and analytics teams to build AI functionality into Atlassian products and platforms. Your daily responsibilities will encompass a broad spectrum of tasks, such as designing system and model features, conducting rigorous experimentation and model evaluations, and providing guidance to junior ML engineers. Your role is pivotal, stretching beyond these tasks to ensure AI's transformative potential is realized across Atlassian.
Qualifications

On your first day, we'll expect you to have:
- Bachelor's or Master's degree (preferably a Computer Science degree or equivalent experience)
- 5+ years of related industry experience in the data science domain
- Expertise in Python or Java and the ability to write performant production-quality code; familiarity with SQL; knowledge of Spark and cloud data environments (e.g., AWS, Databricks)
- Experience building and scaling machine learning models in business applications using large amounts of data
- Ability to communicate and explain data science concepts to diverse audiences and craft a compelling story
- Focus on business practicality and the 80/20 rule; a very high bar for output quality, but recognition of the business benefit of "having something now" vs. "perfection sometime in the future"
- An agile development mindset, appreciating the benefit of constant iteration and improvement

Good to have:
- Experience working in a consumer or B2C space for a SaaS product provider, or in the enterprise/B2B space
- Experience in developing deep learning-based models and working on LLM-related applications
- Excellence in solving ambiguous and complex problems: navigating uncertain situations, breaking down complex challenges into manageable components, and developing innovative solutions

Our Perks & Benefits
Atlassian offers a wide range of perks and benefits designed to support you and your family and to help you engage with your local community. Our offerings include health and wellbeing resources, paid volunteer days, and so much more. To learn more, visit go.atlassian.com/perksandbenefits.

About Atlassian
At Atlassian, we're motivated by a common goal: to unleash the potential of every team. Our software products help teams all over the planet, and our solutions are designed for all types of work. Team collaboration through our tools makes what may be impossible alone, possible together. We believe that the unique contributions of all Atlassians create our success.
To ensure that our products and culture continue to incorporate everyone's perspectives and experience, we never discriminate based on race, religion, national origin, gender identity or expression, sexual orientation, age, or marital, veteran, or disability status. All your information will be kept confidential according to EEO guidelines. To provide you the best experience, we can support with accommodations or adjustments at any stage of the recruitment process. Simply inform our Recruitment team during your conversation with them. To learn more about our culture and hiring process, visit go.atlassian.com/crh .
Posted 1 day ago
4.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
About Us Zelis is modernizing the healthcare financial experience in the United States (U.S.) by providing a connected platform that bridges the gaps and aligns interests across payers, providers, and healthcare consumers. This platform serves more than 750 payers, including the top 5 health plans, BCBS insurers, regional health plans, TPAs and self-insured employers, and millions of healthcare providers and consumers in the U.S. Zelis sees across the system to identify, optimize, and solve problems holistically with technology built by healthcare experts—driving real, measurable results for clients. Why We Do What We Do In the U.S., consumers, payers, and providers face significant challenges throughout the healthcare financial journey. Zelis helps streamline the process by offering solutions that improve transparency, efficiency, and communication among all parties involved. By addressing the obstacles that patients face in accessing care, navigating the intricacies of insurance claims, and the logistical challenges healthcare providers encounter with processing payments, Zelis aims to create a more seamless and effective healthcare financial system. Zelis India plays a crucial role in this mission by supporting various initiatives that enhance the healthcare financial experience. The local team contributes to the development and implementation of innovative solutions, ensuring that technology and processes are optimized for efficiency and effectiveness. Beyond operational expertise, Zelis India cultivates a collaborative work culture, leadership development, and global exposure, creating a dynamic environment for professional growth. With hybrid work flexibility, comprehensive healthcare benefits, financial wellness programs, and cultural celebrations, we foster a holistic workplace experience. Additionally, the team plays a vital role in maintaining high standards of service delivery and contributes to Zelis’ award-winning culture. 
Position Overview

About Zelis
Zelis is a leading payments company in healthcare, guiding, pricing, explaining, and paying for care on behalf of insurers and their members. We align the interests of payers, providers, and consumers to deliver a better financial experience and more affordable, transparent care for all, partnering with 700+ payers and supporting 4 million+ providers and 100 million members across the healthcare industry.

About ZDI
Zelis Data Intelligence (ZDI) is a centralized data team that partners across Zelis business units to unlock the value of data through intelligence and AI solutions. Our mission is to transform data into a strategic and competitive asset by fostering collaboration and innovation:
- Enable the democratization and productization of data assets to drive insights and decision-making
- Develop new data and product capabilities through advanced analytics and AI-driven solutions
- Collaborate closely with business units and enterprise functions to maximize the impact of data
- Leverage intelligence solutions to unlock efficiency, transparency, and value across the organization

Key Responsibilities

Product Expertise & Collaboration
- Become an expert in product areas, acting as the go-to person for stakeholders before they engage with technical data and data engineering teams
- Lead the creation of clear user stories and tasks in collaboration with Engineering teams to track ongoing and upcoming work
- Design, build, and own repeatable processes for implementing projects
- Collaborate with software engineers, data engineers, data scientists, and other product teams to scope new product features and data capabilities, or refine existing ones, to increase business value, adoption, and user engagement
- Understand how the product area aligns with the wider company roadmap and educate internal teams on the organization's vision
Requirements Management & Communication
- Ensure consistent updates of tickets and timelines, following up with technical teams on status and roadblocks
- Draft clear and concise business requirements and technical product documentation
- Understand the Zelis healthcare ecosystem (e.g., claims, payments, provider and member data) and educate the company on requirements and guidelines for accessing, sharing, and requesting information to inform advanced analytics, feature enhancements, and new product innovation
- Communicate with technical audiences to identify requirements, gaps, and barriers, translating needs into product features
- Track key performance indicators to evaluate product performance

Qualifications
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field
- 4+ years of technical experience (business analyst, data analyst, technical product, engineering, etc.) with a demonstrated ability to deliver alongside technical teams
- 4+ years of direct experience with Agile methodologies and frameworks, and with product tools such as Jira and Confluence, to author user stories, acceptance criteria, etc.
- Technical depth that enables you to collaborate with software engineers, data engineers, and data scientists and drive technical discussions about the design of data visualizations, data models, ETLs, and the deployment of data infrastructure
- Understanding of Data Management, Data Engineering, API development, Cloud Engineering, Advanced Analytics, Data Science, or Product Analytics concepts, or other data/product tools such as SQL, Python, R, Spark, AWS, Azure, Airflow, Snowflake, and Power BI

Preferred Qualifications
- Strong communication skills, with clear verbal communication as well as explicit and mindful written communication, to work with technical teams
- B2B or B2C experience helpful
- Familiarity with the US healthcare system
- Hands-on experience with Snowflake or other cloud platforms, including data pipeline architecture, cloud-based systems, BI/Analytics, and deploying data infrastructure solutions
Posted 1 day ago
5.0 - 6.0 years
0 Lacs
Gurugram, Haryana, India
Remote
Job Title: Data Engineer
Website: https://www.issc.co.in
Location: Udyog Vihar, Phase-V, Gurugram
Job Type: Full-Time
Work Arrangement: 5 days working (no hybrid/work from home)
Compensation: As per industry standards

Company Overview:
With the world constantly and rapidly changing, the future will be full of realigned priorities. You are keen to strengthen your firm's profitability and reputation by retaining existing clients and winning more in the market. We at ISSC have the right resources to ensure your team has access to the right skills to deliver effective assurance and IT advisory while you build and scale your team onshore to meet clients' broader assurance needs. By offshoring part of the routine and less complex auditing work to ISSC, you will free up capacity in your own organization that can be utilized in areas requiring more face time with your clients, including your quest to win new clients. Having the right team on your side at ISSC will be vital as you follow your exciting growth plans, and it is in this role that your ISSC team stands apart. We offer a compelling case for becoming your key partner for the future.

Position Summary:
We are seeking a skilled and detail-oriented Data Engineer to join our team. As a Data Engineer, you will be responsible for developing and optimizing data pipelines, managing data architecture, and ensuring the data is easily accessible, reliable, and secure. You will work closely with data scientists, analysts, and other stakeholders to gather requirements and deliver data solutions that support business intelligence and analytics initiatives. The ideal candidate should possess strong data manipulation skills, a keen eye for detail, and the ability to work with diverse datasets. This role plays a crucial part in ensuring the quality and integrity of our data, enabling informed decision-making across the organization.
Responsibilities:
Data Pipeline Development: Design, develop, and maintain scalable data pipelines to process, transform, and move large datasets across multiple platforms. Ensure data integrity, reliability, and quality across all pipelines.
Data Architecture and Infrastructure: Architect and manage the data infrastructure, including databases, warehouses, and data lakes. Implement solutions to optimize storage and retrieval of both structured and unstructured data.
Data Integration and Management: Integrate data from various sources (e.g., APIs, databases, third-party providers) into a unified system. Manage ETL (Extract, Transform, Load) processes to clean, enrich, and make data ready for analysis.
Data Security and Compliance: Ensure data governance, privacy, and compliance with security standards (e.g., GDPR, HIPAA). Implement robust access controls and encryption protocols.
Collaboration: Work closely with data scientists, analysts, and business stakeholders to gather requirements and deliver high-performance data solutions. Collaborate with DevOps and software engineering teams to deploy and maintain the data infrastructure in a cloud or on-premises environment.
Performance Tuning: Monitor and improve the performance of databases and data pipelines to ensure low-latency data availability. Troubleshoot and resolve issues in the data infrastructure.
Documentation and Best Practices: Maintain detailed documentation of data pipelines, architecture, and processes. Follow industry best practices for data engineering, including version control and continuous integration.
Skills/Requirements:
Technical Skills: Proficiency in programming languages such as Python or SQL. Solid experience with big data technologies such as Apache Spark, Hadoop, Kafka, and Flink. Experience with cloud data platforms (AWS, Azure). Familiarity with databases (SQL and NoSQL), data warehousing solutions (e.g., Snowflake, Redshift), and ETL tools (e.g., Airflow, Talend).
Data Modeling and Database Design: Expertise in designing data models and relational database schemas.
Problem-Solving: Strong analytical and problem-solving skills, with the ability to handle complex data issues.
Version Control and Automation: Experience with CI/CD pipelines and version control tools like Git.
Professional Qualifications:
• 5-6 years of relevant experience.
• BTech in Statistics, Information Technology, or a related field.
Other Benefits:
• Free Meal
• 1 Happy Hour every week
• 3 Offsites a year
• 1 Spa session every week
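The pipeline responsibilities above follow the classic extract-transform-load shape. A minimal sketch of those stages in plain Python is below; the sample records, field names, and dict-based "warehouse" are hypothetical stand-ins, and a production pipeline would run on Spark or be orchestrated by a tool like Airflow rather than chained function calls.

```python
# Minimal ETL sketch: extract -> transform -> load, with a basic quality check.
# Records and the target store are illustrative, not a real system.

def extract():
    # In practice this would read from an API, database, or file drop.
    return [
        {"id": 1, "amount": "120.50", "region": "north"},
        {"id": 2, "amount": "80.00", "region": None},   # dirty record
        {"id": 3, "amount": "42.25", "region": "south"},
    ]

def transform(rows):
    # Cast types, drop records failing the integrity check, normalize values.
    cleaned = []
    for row in rows:
        if row["region"] is None:  # required-field check
            continue
        cleaned.append({
            "id": row["id"],
            "amount": float(row["amount"]),
            "region": row["region"].upper(),
        })
    return cleaned

def load(rows, target):
    # In practice this writes to a warehouse table; here, a dict keyed by id.
    for row in rows:
        target[row["id"]] = row
    return target

warehouse = load(transform(extract()), {})
```

The dirty record (missing region) is filtered out during transform, which is the kind of rigorous quality check the listing asks for before data reaches the warehouse.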
Posted 1 day ago
2.0 years
12 - 28 Lacs
Coimbatore, Tamil Nadu, India
On-site
Experience: 3 to 10 years
Location: Coimbatore
Notice Period: Immediate joiners are preferred.
Note: Minimum 2 years of experience in core Gen AI.
Key Responsibilities:
Design, develop, and fine-tune Large Language Models (LLMs) for various in-house applications.
Implement and optimize Retrieval-Augmented Generation (RAG) techniques to enhance AI response quality.
Develop and deploy Agentic AI systems capable of autonomous decision-making and task execution.
Build and manage data pipelines for processing, transforming, and feeding structured/unstructured data into AI models.
Ensure scalability, performance, and security of AI-driven solutions in production environments.
Collaborate with cross-functional teams, including data engineers, software developers, and product managers.
Conduct experiments and evaluations to improve AI system accuracy and efficiency.
Stay updated with the latest advancements in AI/ML research, open-source models, and industry best practices.
Required Skills & Qualifications:
Strong experience in LLM fine-tuning using frameworks like Hugging Face, DeepSpeed, or LoRA/PEFT.
Hands-on experience with RAG architectures, including vector databases (e.g., Pinecone, ChromaDB, Weaviate, OpenSearch, FAISS).
Experience in building AI agents using LangChain, LangGraph, CrewAI, AutoGPT, or similar frameworks.
Proficiency in Python and deep learning frameworks like PyTorch or TensorFlow.
Experience with Python web frameworks such as FastAPI, Django, or Flask.
Experience in designing and managing data pipelines using tools like Apache Airflow, Kafka, or Spark.
Knowledge of cloud platforms (AWS/GCP/Azure) and containerization technologies (Docker, Kubernetes).
Familiarity with LLM APIs (OpenAI, Anthropic, Mistral, Cohere, Llama, etc.) and their integration in applications.
Strong understanding of vector search, embedding models, and hybrid retrieval techniques.
Experience with optimizing inference and serving AI models in real-time production systems.
Nice-to-Have Skills:
Experience with multi-modal AI (text, image, audio).
Familiarity with privacy-preserving AI techniques and responsible AI frameworks.
Understanding of MLOps best practices, including model versioning, monitoring, and deployment automation.
Skills: pytorch, rag architectures, opensearch, weaviate, docker, llm fine-tuning, chromadb, apache airflow, lora, python, hybrid retrieval techniques, django, gcp, crewai, openai, hugging face, gen ai, pinecone, faiss, aws, autogpt, embedding models, flask, fastapi, llm apis, deepspeed, vector search, peft, langchain, azure, spark, kubernetes, tensorflow, real-time production systems, langgraph, kafka
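The RAG work described in this listing reduces to embedding a query, retrieving the most similar stored chunks, and prepending them to the LLM prompt. A toy version of the retrieval step is sketched below using hand-made 3-dimensional vectors; a real system would obtain embeddings from a model and store them in a vector database such as FAISS or Pinecone, so every vector and chunk here is illustrative only.

```python
import math

# Toy RAG retrieval: cosine similarity over hand-made "embeddings".

def cosine(a, b):
    # Standard cosine similarity between two equal-length vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# (chunk text, embedding) pairs standing in for an indexed corpus.
corpus = [
    ("Refund policy: 30 days with receipt.", [0.9, 0.1, 0.0]),
    ("Shipping takes 3-5 business days.",    [0.1, 0.9, 0.1]),
    ("Store hours are 9am to 6pm.",          [0.0, 0.2, 0.9]),
]

def retrieve(query_vec, k=1):
    # Rank chunks by similarity to the query embedding; return the top k.
    ranked = sorted(corpus, key=lambda item: cosine(query_vec, item[1]),
                    reverse=True)
    return [text for text, _ in ranked[:k]]

# A query whose (hypothetical) embedding lies closest to the refund chunk.
context = retrieve([0.8, 0.2, 0.1], k=1)
# The retrieved context would then be inserted into the LLM prompt.
```

The hybrid retrieval techniques the listing mentions would combine this dense-vector ranking with keyword (e.g., BM25) scores before selecting the top chunks.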
Posted 1 day ago
4.0 - 7.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Title: Data Engineer
Location: Hyderabad, India
Employment Type: Full-time
Experience: 4 to 7 Years
About NationsBenefits: At NationsBenefits, we are leading the transformation of the insurance industry by developing innovative benefits management solutions. We focus on modernizing complex back-office systems to create scalable, secure, and high-performing platforms that streamline operations for our clients. As part of our strategic growth, we are focused on platform modernization: transitioning legacy systems to modern, cloud-native architectures that support the scalability, reliability, and high performance of core back-office functions in the insurance domain.
Position Overview: We are seeking a self-driven Data Engineer with 4-7 years of experience to build and optimize scalable ETL/ELT pipelines using Azure Databricks, PySpark, and Delta Lake. The role involves working across scrum teams to develop data solutions, ensure data governance with Unity Catalog, and support real-time and batch processing. Strong problem-solving skills, T-SQL expertise, and hands-on experience with Azure cloud tools are essential. Healthcare domain knowledge is a plus.
Job Description:
Work with different scrum teams to develop all the quality database programming requirements of the sprint.
Experience with Azure cloud tools such as Databricks, Azure SQL, Data Factory (ADF), Data Lake, and Data Storage, along with advanced Python programming and SSIS.
Create and deploy scalable ETL/ELT pipelines with Azure Databricks using PySpark and SQL.
Create Delta Lake tables with ACID transactions and schema evolution to support real-time and batch processing.
Experience in Unity Catalog for centralized data governance, access control, and data lineage tracking.
Independently analyse, solve, and correct issues in real time, providing end-to-end problem resolution.
Develop unit tests so that components can be tested automatically.
Use SOLID development principles to maintain data integrity and cohesiveness.
Interact with the product owner and business representatives to determine and satisfy needs.
Sense of ownership and pride in your performance and its impact on the company's success.
Critical thinking and problem-solving skills.
Team player.
Good time-management skills.
Great interpersonal and communication skills.
Mandatory Qualifications:
4-7 years of experience as a Data Engineer.
Self-driven with minimal supervision.
Proven experience with T-SQL programming, Azure Databricks, Spark (PySpark/Scala), Delta Lake, Unity Catalog, and ADLS Gen2.
Exposure to Microsoft TFS, Visual Studio, and DevOps.
Experience with cloud platforms such as Azure.
Analytical, problem-solving mindset.
Preferred Qualifications:
Healthcare domain knowledge.
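The Delta Lake work above centers on MERGE-style upserts: matched rows are updated, unmatched rows are inserted, all within one ACID transaction. The sketch below mimics only the upsert semantics with a plain dict as the "table"; on Databricks this would be a `DeltaTable.merge` call running on Spark, so the member records and field names here are hypothetical.

```python
# Sketch of MERGE (upsert) semantics as provided by Delta Lake tables.
# The "table" is a dict keyed by primary key; a real Delta MERGE would
# also give atomicity and isolation, which this toy version does not.

def merge_upsert(table, updates, key="id"):
    # Matched rows are updated field-by-field; unmatched rows are inserted.
    for row in updates:
        table[row[key]] = {**table.get(row[key], {}), **row}
    return table

members = {
    1: {"id": 1, "plan": "basic", "status": "active"},
    2: {"id": 2, "plan": "plus", "status": "active"},
}

batch = [
    {"id": 2, "status": "lapsed"},                    # update existing member
    {"id": 3, "plan": "basic", "status": "active"},   # insert new member
]

merge_upsert(members, batch)
```

Note that the update for member 2 changes only `status` while preserving `plan`, mirroring how a MERGE's `WHEN MATCHED THEN UPDATE` clause touches only the listed columns.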