
15 Data Optimization Jobs

Set up a job alert
JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

2.0 - 6.0 years

0 Lacs

Delhi

On-site

The Supplier Optimization Manager position at TravelBullz requires a highly analytical and commercially minded individual to lead API-driven supplier performance and commercial efficiency strategies. The role involves vendor analysis, real-time data optimization, and destination performance mapping to ensure TravelBullz maximizes the value of every API connection.

Key responsibilities include monitoring and optimizing supplier performance through real-time data analysis, focusing on Look-to-Book ratios, availability health, conversion rates, and error mapping. You will conduct API buying optimization by reviewing competitive pricing, content quality, and connectivity health, and manage and continuously refine supplier mapping so that the best-performing suppliers serve the highest-demand routes and destinations. You will also analyze best-selling destinations and inventory gaps to reallocate traffic and improve coverage. Collaboration with product, tech, and commercial teams is essential to implement supplier rule engines, fallback logic, and prioritization strategies. In addition, you will lead supplier QBRs with actionable insights based on KPIs such as fill rate, uptime, and quote speed, and support negotiation of commercial terms aligned with performance metrics and strategic goals.

To qualify for this role, you should have a Bachelor's degree in Business, Supply Chain, Data Analytics, or a related field (Master's preferred) and at least 2 years of experience in travel tech, OTA, wholesaler, or API-based B2B optimization. A strong understanding of supplier API structures, cache/feed performance, and commercial logic is required; experience building or optimizing supplier distribution matrices, destination-supplier matching, or fallback strategies is also beneficial. Proficiency in Excel and BI tools (e.g., Power BI, Elastic) and fluency in English are essential. Experience with rate caching, deduplication engines, or travel data platforms is a bonus.

In return, TravelBullz offers a competitive salary with performance bonus, medical benefits, flexible working hours, regional exposure, cross-functional collaboration, and fast-track leadership potential in a growing digital enterprise. If you are interested in this opportunity, please share your resume at hr@travelbullz.com with the subject line "Supplier Optimization Manager".
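The Look-to-Book ratio the listing centers on is simply searches ("looks") per confirmed booking for each supplier. A minimal sketch of computing it, assuming a hypothetical event stream of `(supplier, kind)` tuples (the field names and supplier labels are illustrative, not TravelBullz's actual data model):

```python
# Hypothetical sketch: Look-to-Book ratio per supplier from raw events.
# A high ratio means many API searches per booking, i.e. costly connectivity.
from collections import Counter

def look_to_book(events):
    """events: iterable of (supplier, kind), kind is 'search' or 'booking'.
    Returns {supplier: searches per booking} for suppliers with bookings."""
    searches, bookings = Counter(), Counter()
    for supplier, kind in events:
        (searches if kind == "search" else bookings)[supplier] += 1
    return {s: searches[s] / bookings[s] for s in bookings}

ratios = look_to_book([
    ("supplier_a", "search"), ("supplier_a", "search"),
    ("supplier_a", "booking"),
    ("supplier_b", "search"), ("supplier_b", "booking"),
])
# → {"supplier_a": 2.0, "supplier_b": 1.0}
```

In practice this would run over streamed or windowed event data rather than an in-memory list, but the KPI itself is just this ratio.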

Posted 1 week ago

Apply

10.0 - 14.0 years

0 Lacs

Maharashtra

On-site

You will head the Data Analytics and AI team within the Enterprise Technology Group, with the primary purpose of leading and developing a robust team focused on driving the company towards a data-first, data-driven organization. The role involves delivering and maintaining smart data-driven solutions essential for growth and operational efficiency, and accelerating the adoption and implementation of AI solutions to enhance innovation and competitive advantage. You will manage all aspects of new data analytics and AI projects from inception to delivery, including day-to-day operations, with a strong emphasis on assessing business value, ROI, and cost reduction.

The ideal candidate holds a Bachelor's or Master's degree in Computer Science, Math, Quantitative Methods, or Information Systems, with over 10 years of overall technology experience, including substantial experience in data analytics and AI. Proficiency in programming languages such as Python or R, and a deep understanding of algorithms such as linear regression, logistic regression, decision trees, random forest, boosting algorithms (e.g., XGBoost), k-means, hierarchical clustering, and principal component analysis, is preferred. Familiarity with data science concepts, statistical and mathematical methods, and various data and AI technologies/tools is essential.

Located in Mumbai, this is a full-time office-based position within the Enterprise Technology Group, reporting to the Head of the group. Key responsibilities include leading and developing a strong data analytics and AI team, establishing best practices for data analytics and responsible AI initiatives, managing all stages of AI and LLM projects, collaborating with stakeholders to gather requirements, implementing advanced models and algorithms to solve complex business problems, and staying updated with the latest advancements in AI and LLM technologies.

Key behavioral strengths required for this position include excellent analytical, problem-solving, and decision-making skills, proven leadership abilities, effective communication and collaboration skills, and the capacity to work with diverse stakeholders. Joining CulverMax Entertainment Pvt Ltd offers the opportunity to work with some of India's leading entertainment channels and OTT platforms, within a progressive and inclusive work culture that celebrates diversity and innovation. As part of the team, you will contribute to creating exceptional content and experiences at a company recognized as an Employer of Choice in various prestigious awards and accolades.
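Of the algorithms the listing names, linear regression is the simplest to show concretely. A minimal sketch fitting it with NumPy's least-squares solver on a toy noise-free dataset (the data is invented purely for illustration):

```python
# Ordinary least-squares linear regression, one of the algorithms listed
# above, fit directly with NumPy's least-squares solver.
import numpy as np

X = np.array([[1.0], [2.0], [3.0], [4.0]])   # single feature
y = np.array([3.0, 5.0, 7.0, 9.0])           # exactly y = 2x + 1

Xb = np.hstack([np.ones((len(X), 1)), X])    # prepend a bias column
coef, *_ = np.linalg.lstsq(Xb, y, rcond=None)
intercept, slope = coef                      # recovers ≈ 1.0 and ≈ 2.0
```

Real pipelines would use a library like scikit-learn on noisy data with train/test splits, but the fitted object is the same pair of coefficients.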

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Ahmedabad, Gujarat

On-site

You are a creative and data-driven Content Strategist responsible for leading the planning, development, and execution of content across various platforms. The role involves aligning content with brand goals, managing content calendars, and ensuring consistency in tone and messaging to drive engagement, reach, and conversions.

Key responsibilities include developing and managing a comprehensive content strategy for web, social media, blogs, email, and other digital channels. This involves conducting audience research, competitor analysis, and SEO audits to inform content planning. Collaboration with design, marketing, and product teams is essential to deliver cohesive messaging, and you will create and manage content calendars that align with campaigns, launches, and business goals.

As a Content Strategist, you will supervise writers, freelancers, and content creators to ensure content quality and consistency. Tracking performance using tools like Google Analytics, SEMrush, or HubSpot will be crucial, enabling you to optimize content based on data-driven insights. Staying updated on industry trends and emerging content formats is also necessary to keep the strategy effective.

If you are interested in this role, please share your CV at info@xcelhrsolutions.com.

Posted 3 weeks ago

Apply

4.0 - 8.0 years

0 Lacs

Hyderabad, Telangana

On-site

You should have at least 4 years of experience in Power BI and Tableau, specializing in data modeling for reporting and data warehouses, with a thorough understanding of the BI process and excellent communication skills.

Responsibilities include Tableau Server management: installation, configuration, and administration to maintain consistent availability and performance. You must be able to design, develop, and optimize data models for large datasets and complex reporting requirements. Strong analytical and debugging skills are essential to identify, analyze, and resolve issues within Power BI reports, SQL code, and data for accurate and efficient performance.

Proficiency in DAX and Power Query, along with advanced knowledge of data modeling concepts, is required, as are strong SQL skills for querying, troubleshooting, and data manipulation. Security implementation is crucial: you will manage user permissions and roles to ensure data security and compliance with organizational policies. A good understanding of ETL processes and in-depth knowledge of Power BI Service, Tableau Server, and Tableau Desktop are expected; familiarity with workspaces, datasets, dataflows, and security configurations will help you fulfil the role effectively.

Posted 3 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Maharashtra

On-site

As the leader of the data analytics and AI team, you will play a crucial role in driving the organization towards a data-first, data-driven approach. Your primary responsibility will be to deliver and maintain smart data-driven solutions that are essential for growth and daily operations. By accelerating the adoption of AI solutions, you will enhance efficiency, foster innovation, and gain a competitive advantage for the company. You will oversee all aspects of new data and AI projects, from inception to delivery, with a strong emphasis on assessing business value, ROI, and cost reduction.

Key responsibilities include leading and developing a robust data analytics and AI team, establishing best practices and methodologies for data analytics and responsible AI initiatives, managing and overseeing AI and LLM projects, building relationships with business process owners, collaborating with internal and external stakeholders, developing advanced models and algorithms, addressing data mining performance issues, and staying updated on the latest advancements in AI and LLM technologies to improve project outcomes.

Preferred skills include proficiency in programming languages such as Python or R, and understanding of algorithms such as linear regression, logistic regression, decision trees, random forest, boosting algorithms (e.g., XGBoost), clustering techniques, and principal component analysis. Familiarity with data science concepts including statistics, probability, data transformation, visualization, storytelling, and deep learning, as well as expertise in statistical and mathematical methods and data and AI technologies/tools such as MLOps, ETL, data lakes, graph databases, NLP, and data optimization, will be beneficial for success in this position.

Posted 4 weeks ago

Apply

2.0 - 6.0 years

1 - 2 Lacs

Chennai

Work from Office

Skilled in tool selection, insert grades, and cutting data optimization. Strong in G-code, M-code, and CNC programming. Proficient in 2-axis to multi-axis and sliding-head CNC lathes. Knowledge of lean, Kaizen, and 5S practices. Able to create programs for Fanuc and Siemens controls.

Posted 1 month ago

Apply

7.0 - 11.0 years

0 Lacs

Haryana

On-site

As a Data Engineering Manager at GoKwik, you will lead a team of data engineers and collaborate closely with product managers, data scientists, business intelligence teams, and SDEs to design and implement data-driven strategies across the organization. You will be responsible for designing the overall data architecture that drives valuable insights for the company.

Key responsibilities include leading and guiding the data engineering team in developing optimal data strategies according to business needs; identifying and implementing process improvements to enhance data models, architectures, pipelines, and applications; ensuring data optimization processes; managing data governance, security, and analysis; and hiring and mentoring top talent within the team. You will also manage data delivery through high-performing dashboards, visualizations, and reports; ensure data quality and security across product verticals; design and launch new data models and pipelines; act as project manager for data projects; and foster a data-driven culture within the team.

To excel in this role, you should have a Bachelor's or Master's degree in Computer Science, Mathematics, or a relevant field, along with at least 7 years of experience in data engineering. Strong project management skills, proficiency in SQL and relational databases, experience building data pipelines and architectures, familiarity with data transformation processes, and working knowledge of AWS cloud services are essential.

We are seeking individuals who are independent, resourceful, analytical, and adept at problem-solving, with the ability to thrive in a fast-paced, dynamic environment. Excellent verbal and written communication skills are crucial for effective collaboration within cross-functional teams. If you are looking to be part of a high-growth startup that values innovation, talent, and customer-centricity, and you are passionate about tackling challenging problems and making a significant impact in an entrepreneurial setting, we invite you to join the team at GoKwik!
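The SQL and pipeline skills this role asks for can be illustrated in miniature with the standard library alone. A hedged sketch (the table and column names are invented, and this is not GoKwik's stack): load raw records into a relational store, then push the aggregation down into SQL rather than doing it in application code.

```python
# Tiny extract-load-aggregate sketch using stdlib sqlite3: illustrative
# of SQL-backed pipeline work, with placeholder schema and data.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, amount REAL, status TEXT)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(1, 250.0, "paid"), (2, 100.0, "refunded"), (3, 400.0, "paid")],
)

# Aggregate inside the database: the engine filters and sums for us.
total_paid = conn.execute(
    "SELECT SUM(amount) FROM orders WHERE status = 'paid'"
).fetchone()[0]
# → 650.0
```

The same pattern scales up: production pipelines swap sqlite for a warehouse and the inline list for an ingestion step, but the "filter and aggregate where the data lives" principle is unchanged.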

Posted 1 month ago

Apply

2.0 - 6.0 years

0 Lacs

Kolkata, West Bengal

On-site

You are a Data Engineer with 2 to 4 years of experience in Python and PL/SQL. Your primary responsibility is to design, develop, and maintain data pipelines, ETL processes, and database solutions.

ETL development and data processing: develop, optimize, and maintain ETL pipelines for data ingestion, transformation, and integration; handle structured and semi-structured data from various sources; and implement data cleansing, validation, and enrichment processes using Python and PL/SQL.

Database development and optimization: write, debug, and optimize complex SQL queries, stored procedures, functions, and triggers in PL/SQL; design and maintain database schemas, indexing strategies, and partitioning for performance optimization; and ensure data consistency, quality, and governance across all data sources.

Data engineering and automation: automate data workflows using Python scripts and scheduling tools like Airflow, cron, or DBMS_JOB; optimize query performance and troubleshoot database-related performance issues; and monitor data pipelines for failures while implementing alerting mechanisms.

Collaboration and documentation: collaborate closely with data analysts, architects, and business teams to understand data requirements; document ETL processes, database schemas, and data flow diagrams; and participate in code reviews, testing, and performance tuning activities.

Technical skills: strong experience in Python for data processing (Pandas, NumPy, PySpark), expertise in PL/SQL, hands-on experience with ETL tools, and knowledge of relational and non-relational databases. Exposure to cloud and big data technologies (AWS/GCP/Azure, Spark, or Snowflake) is advantageous. Soft skills such as problem-solving, effective communication, teamwork, and the ability to manage tasks independently are essential.

This is a full-time, permanent position with a day shift schedule and an in-person work location.
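The cleansing and validation work described above often reduces to a short chain of pandas operations. A minimal sketch, with invented column names and rules (real sources and validation rules would come from the business teams):

```python
# Illustrative cleansing/validation step with pandas: drop rows missing
# a required key, remove exact duplicates, normalize the key's dtype.
import pandas as pd

raw = pd.DataFrame({
    "user_id": [1, 2, 2, None],
    "email": ["a@x.com", "b@x.com", "b@x.com", "c@x.com"],
})

clean = (
    raw.dropna(subset=["user_id"])   # validation: reject rows missing the key
       .drop_duplicates()            # cleansing: remove exact duplicate rows
       .astype({"user_id": int})     # normalize: None forced the column to float
)
# clean keeps 2 rows: user_id 1 and 2
```

In a real pipeline each dropped row would typically be logged or routed to a quarantine table rather than silently discarded.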

Posted 1 month ago

Apply

4.0 - 8.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

The ideal candidate for the Azure Data Engineer position will have 4-6 years of experience designing, implementing, and maintaining data solutions on Microsoft Azure. As an Azure Data Engineer, you will design efficient data architecture diagrams, develop and maintain data models, and ensure data integrity, quality, and security. You will also work on data processing and data integration, building data pipelines to support various business needs.

The role involves collaborating with product and project leaders to translate data needs into actionable projects, providing technical expertise on data warehousing and data modeling, and mentoring other developers to ensure compliance with company policies and best practices. You will maintain documentation, contribute to the company's knowledge base, and actively participate in team collaboration and problem-solving.

We are looking for a candidate with a Bachelor's degree in Computer Science, Information Technology, or a related field, along with proven experience as a Data Engineer focused on Microsoft Azure. Proficiency in SQL and experience with Azure data services such as Azure Data Factory, Azure SQL Database, Azure Databricks, and Azure Synapse Analytics are required, along with a strong understanding of data architecture, data modeling, data integration, ETL/ELT processes, and data security standards. Excellent problem-solving, collaboration, and communication skills are also important.

As part of the team, you will work on projects across industries such as high-tech, communication, media, healthcare, retail, and telecom, in a collaborative environment where you can expand your skills alongside a diverse team of talented individuals. GlobalLogic prioritizes work-life balance and provides professional development opportunities, excellent benefits, and fun perks. Join GlobalLogic, a leader in digital engineering that helps brands design and build innovative products, platforms, and digital experiences for the modern world. Headquartered in Silicon Valley, GlobalLogic operates design studios and engineering centers worldwide, serving customers in industries such as automotive, communications, financial services, healthcare, manufacturing, media and entertainment, semiconductor, and technology.

Posted 1 month ago

Apply

3.0 - 5.0 years

4 - 4 Lacs

Ahmedabad

Work from Office

Roles and Responsibilities:
• Supervise and guide the Lead Generation, Lead Acquisition, and Customer Experience teams.
• Define and track KPIs, KRAs, daily productivity, and conversion metrics.
• Oversee lead allocation, validation, deal pipeline movement, demo performance, and coordination with field sales teams.
• Monitor demos and closures, and ensure timely follow-ups on expiring subscriptions.
• Implement and standardize processes across teams as per SOPs.
• Identify and resolve gaps in lead qualification, feedback, and follow-up strategies, working cross-functionally to fix them.
• Optimize usage of CRM and reporting tools, ensuring real-time data accuracy.
• Manage the escalation flow from Customer Experience Executives and resolve high-priority client issues.
• Prepare weekly and monthly performance dashboards across teams (leads shared, demos arranged, closures, conversion %).
• Analyze feedback, churn, and deal-loss reasons to suggest actionable insights.
• Maintain detailed logs for calls, demos, meetings, and feedback quality.
• Collaborate with the Sales Operations Manager to execute growth initiatives.
• Drive a performance-based culture with recognition, feedback, and consistent communication.

Posted 1 month ago

Apply

4.0 - 7.0 years

20 - 35 Lacs

Chennai

Remote

Role & responsibilities:
• Architect, deploy, and manage scalable cloud environments (AWS/GCP/DO) supporting distributed data processing that handles terabyte-scale datasets and billions of records efficiently.
• Automate infrastructure provisioning, monitoring, and disaster recovery using tools like Terraform, Kubernetes, and Prometheus.
• Optimize CI/CD pipelines to ensure seamless deployment of web scraping workflows and infrastructure updates.
• Develop and maintain stealthy web scrapers using Puppeteer, Playwright, and headless Chromium browsers.
• Reverse-engineer bot-detection mechanisms (e.g., TLS fingerprinting, CAPTCHA solving) and implement evasion strategies.
• Monitor system health, troubleshoot bottlenecks, and ensure 99.99% uptime for data collection and processing pipelines.
• Implement security best practices for cloud infrastructure, including intrusion detection, data encryption, and compliance audits.
• Partner with data collection, ML, and SaaS teams to align infrastructure scalability with evolving data needs.

Preferred candidate profile:
• 4-7 years of experience in site reliability engineering and cloud infrastructure management.
• Proficiency in Python and JavaScript for scripting and automation.
• Hands-on experience with Puppeteer/Playwright, headless browsers, and anti-bot evasion techniques.
• Knowledge of networking protocols, TLS fingerprinting, and CAPTCHA-solving frameworks.
• Experience with monitoring and observability tools such as Grafana, Prometheus, and Elasticsearch, including monitoring and optimizing resource utilization in distributed systems.
• Experience with data lake architectures and optimizing storage using formats such as Parquet, Avro, or ORC.
• Strong proficiency in cloud platforms (AWS, GCP, or Azure) and containerization/orchestration (Docker, Kubernetes).
• Deep understanding of infrastructure-as-code tools (Terraform, Ansible).
• Deep experience designing resilient data systems with a focus on fault tolerance, data replication, and disaster recovery strategies in distributed environments.
• Experience implementing observability frameworks, distributed tracing, and real-time monitoring tools.
• Excellent problem-solving abilities, a collaborative mindset, and strong communication skills.
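One reliability pattern implicit in the uptime and monitoring duties above is retrying a flaky check with exponential backoff before declaring a target unhealthy. A minimal standard-library sketch, where `probe` is a stand-in for a real HTTP or TCP health check:

```python
# Health check with exponential backoff: retry a probe a few times,
# doubling the wait between attempts, before reporting failure.
import time

def check_with_backoff(probe, retries=3, base_delay=0.01):
    """Return True if probe() succeeds within `retries` attempts."""
    for attempt in range(retries):
        if probe():
            return True
        time.sleep(base_delay * (2 ** attempt))  # 0.01s, 0.02s, 0.04s, ...
    return False

calls = {"n": 0}
def flaky():                      # simulated target: fails twice, then recovers
    calls["n"] += 1
    return calls["n"] >= 3

assert check_with_backoff(flaky)  # succeeds on the third attempt
```

Production versions add jitter to the delay (to avoid synchronized retry storms across workers) and emit metrics to Prometheus-style collectors rather than returning a bare boolean.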

Posted 1 month ago

Apply

4.0 - 7.0 years

20 - 35 Lacs

Chennai

Remote

Role & responsibilities:
• Architect, deploy, and manage scalable cloud environments (AWS/GCP/DO) supporting distributed data processing that handles terabyte-scale datasets and billions of records efficiently.
• Automate infrastructure provisioning, monitoring, and disaster recovery using tools like Terraform, Kubernetes, and Prometheus.
• Optimize CI/CD pipelines to ensure seamless deployment of web scraping workflows and infrastructure updates.
• Develop and maintain stealthy web scrapers using Puppeteer, Playwright, and headless Chromium browsers.
• Reverse-engineer bot-detection mechanisms (e.g., TLS fingerprinting, CAPTCHA solving) and implement evasion strategies.
• Monitor system health, troubleshoot bottlenecks, and ensure 99.99% uptime for data collection and processing pipelines.
• Implement security best practices for cloud infrastructure, including intrusion detection, data encryption, and compliance audits.
• Partner with data collection, ML, and SaaS teams to align infrastructure scalability with evolving data needs.

Preferred candidate profile:
• 4-7 years of experience in site reliability engineering and cloud infrastructure management.
• Proficiency in Python and JavaScript for scripting and automation.
• Hands-on experience with Puppeteer/Playwright, headless browsers, and anti-bot evasion techniques.
• Knowledge of networking protocols, TLS fingerprinting, and CAPTCHA-solving frameworks.
• Experience with monitoring and observability tools such as Grafana, Prometheus, and Elasticsearch, including monitoring and optimizing resource utilization in distributed systems.
• Experience with data lake architectures and optimizing storage using formats such as Parquet, Avro, or ORC.
• Strong proficiency in cloud platforms (AWS, GCP, or Azure) and containerization/orchestration (Docker, Kubernetes).
• Deep understanding of infrastructure-as-code tools (Terraform, Ansible).
• Deep experience designing resilient data systems with a focus on fault tolerance, data replication, and disaster recovery strategies in distributed environments.
• Experience implementing observability frameworks, distributed tracing, and real-time monitoring tools.
• Excellent problem-solving abilities, a collaborative mindset, and strong communication skills.

Posted 2 months ago

Apply

3 - 7 years

7 - 10 Lacs

Bengaluru

Remote

• Design and implement scalable, efficient, and high-performance data pipelines.
• Develop and optimize ETL/ELT workflows using modern tools and frameworks.
• Work with cloud platforms (AWS, Azure, GCP).

A detailed JD will be given later.
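The pipeline shape this listing describes can be sketched with generators: extract, transform, and load stages that stream records one at a time instead of materializing the whole dataset. All names and records here are placeholders for illustration:

```python
# Generator-based ETL sketch: each stage consumes the previous one lazily,
# so memory use stays constant regardless of dataset size.
def extract():
    # Stand-in for reading from a file, queue, or API.
    yield from [{"id": 1, "amt": "10"}, {"id": 2, "amt": "x"}, {"id": 3, "amt": "5"}]

def transform(rows):
    for row in rows:
        try:
            yield {**row, "amt": float(row["amt"])}
        except ValueError:
            continue  # in practice, route bad records to a dead-letter store

def load(rows, sink):
    sink.extend(rows)  # stand-in for writing to a warehouse table

sink = []
load(transform(extract()), sink)
# sink holds the 2 valid records (ids 1 and 3); the malformed row is dropped
```

Frameworks like Spark or Airflow-managed tasks generalize this same stage structure to distributed and scheduled execution.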

Posted 3 months ago

Apply

4.0 - 7.0 years

20 - 35 Lacs

Chennai

Remote

Role & responsibilities:
• Architect, deploy, and manage scalable cloud environments (AWS/GCP/DO) supporting distributed data processing that handles terabyte-scale datasets and billions of records efficiently.
• Automate infrastructure provisioning, monitoring, and disaster recovery using tools like Terraform, Kubernetes, and Prometheus.
• Optimize CI/CD pipelines to ensure seamless deployment of web scraping workflows and infrastructure updates.
• Develop and maintain stealthy web scrapers using Puppeteer, Playwright, and headless Chromium browsers.
• Reverse-engineer bot-detection mechanisms (e.g., TLS fingerprinting, CAPTCHA solving) and implement evasion strategies.
• Monitor system health, troubleshoot bottlenecks, and ensure 99.99% uptime for data collection and processing pipelines.
• Implement security best practices for cloud infrastructure, including intrusion detection, data encryption, and compliance audits.
• Partner with data collection, ML, and SaaS teams to align infrastructure scalability with evolving data needs.

Preferred candidate profile:
• 4-7 years of experience in site reliability engineering and cloud infrastructure management.
• Proficiency in Python and JavaScript for scripting and automation.
• Hands-on experience with Puppeteer/Playwright, headless browsers, and anti-bot evasion techniques.
• Knowledge of networking protocols, TLS fingerprinting, and CAPTCHA-solving frameworks.
• Experience with monitoring and observability tools such as Grafana, Prometheus, and Elasticsearch, including monitoring and optimizing resource utilization in distributed systems.
• Experience with data lake architectures and optimizing storage using formats such as Parquet, Avro, or ORC.
• Strong proficiency in cloud platforms (AWS, GCP, or Azure) and containerization/orchestration (Docker, Kubernetes).
• Deep understanding of infrastructure-as-code tools (Terraform, Ansible).
• Deep experience designing resilient data systems with a focus on fault tolerance, data replication, and disaster recovery strategies in distributed environments.
• Experience implementing observability frameworks, distributed tracing, and real-time monitoring tools.
• Excellent problem-solving abilities, a collaborative mindset, and strong communication skills.

Posted Date not available

Apply

8.0 - 13.0 years

7 - 17 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

Proficient in the Braze marketing platform with expertise in data model design and campaign orchestration. Strong background in lifecycle programs, personalization, and marketing data optimization, including list cleansing.
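The "list cleansing" mentioned above usually means deduplicating and dropping malformed addresses before a send. A minimal sketch with a deliberately simplified validity check (this is a generic illustration, not Braze's actual ingestion rules):

```python
# Marketing list cleansing sketch: normalize case/whitespace, drop
# malformed addresses, and deduplicate while preserving order.
import re

EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")  # intentionally naive

def cleanse(addresses):
    seen, out = set(), []
    for addr in (a.strip().lower() for a in addresses):
        if EMAIL_RE.match(addr) and addr not in seen:
            seen.add(addr)
            out.append(addr)
    return out

cleansed = cleanse(["A@x.com", "a@x.com ", "not-an-email", "b@y.org"])
# → ["a@x.com", "b@y.org"]
```

Real cleansing pipelines layer on bounce-history suppression and opt-out lists; the normalize-validate-deduplicate core stays the same.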

Posted Date not available

Apply