JobPe aggregates results for easy application access, but you actually apply on the job portal directly.

2.0 - 6.0 years

0 Lacs

Karnataka

On-site

As a skilled Web Scraping Data Analyst, you will be responsible for collecting, cleaning, and analyzing data from various online sources. Your expertise in Python-based scraping frameworks, data transformation, and proxy/VPN rotation and IP management will be crucial in building data pipelines that support our analytics and business intelligence initiatives.

Key responsibilities:
- Design, develop, and maintain robust web scraping scripts using tools such as Python, BeautifulSoup, Scrapy, and Selenium
- Implement IP rotation, proxy management, and anti-bot evasion techniques
- Deploy scraping tools on cloud-based or edge servers and monitor scraping jobs for uptime and efficiency
- Parse and structure unstructured or semi-structured web data into clean, usable datasets
- Collaborate with data analysts and data engineers to integrate web-sourced data into internal databases and reporting systems
- Conduct exploratory data analysis (EDA)
- Ensure compliance with website scraping policies, robots.txt, and relevant data privacy regulations

Required skills:
- Proficiency in Python and experience with libraries such as Requests, BeautifulSoup, Scrapy, and Pandas
- Knowledge of proxy/VPN usage, IP rotation, and web traffic routing tools (e.g., Smartproxy, BrightData, Tor)
- Familiarity with cloud platforms (AWS, Azure, or GCP) and Linux-based environments
- Experience deploying scraping scripts on edge servers or in containerized environments (e.g., Docker)
- Solid understanding of HTML, CSS, JSON, and browser dev tools for DOM inspection
- Strong analytical mindset with experience in data cleansing, transformation, and visualization
- Good knowledge of SQL and basic data querying
- Ability to handle large volumes of data and build efficient data pipelines

Preferred qualifications:
- Experience with headless browsers such as Puppeteer or Playwright
- Familiarity with scheduling tools such as Airflow or Cron
- Background in data analytics or reporting using tools like Tableau, Power BI, or Jupyter Notebooks
- Knowledge of anti-captcha solutions and browser automation challenges

This is a full-time position with an in-person work location.
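The parsing step this listing describes (turning semi-structured web data into clean, usable datasets) can be sketched with Python's standard-library `HTMLParser`; the HTML snippet, class names, and fields below are hypothetical stand-ins for a real scraped page:

```python
from html.parser import HTMLParser

# Hypothetical page fragment standing in for scraped job-listing markup.
SAMPLE = """
<ul>
  <li class="job"><span class="title">Data Analyst</span><span class="loc">Bengaluru</span></li>
  <li class="job"><span class="title">Scraping Engineer</span><span class="loc">Pune</span></li>
</ul>
"""

class JobParser(HTMLParser):
    """Collects {"title": ..., "loc": ...} rows from the matching <span> tags."""
    def __init__(self):
        super().__init__()
        self.rows = []          # clean, structured output
        self._field = None      # which field the next text node belongs to
        self._current = {}      # row being assembled

    def handle_starttag(self, tag, attrs):
        cls = dict(attrs).get("class")
        if tag == "span" and cls in ("title", "loc"):
            self._field = cls

    def handle_data(self, data):
        if self._field:
            self._current[self._field] = data.strip()
            self._field = None
            if len(self._current) == 2:     # both fields seen: row complete
                self.rows.append(self._current)
                self._current = {}

parser = JobParser()
parser.feed(SAMPLE)
print(parser.rows)
```

In practice a library like BeautifulSoup replaces the hand-rolled parser, but the shape is the same: walk the DOM, pull named fields, emit clean records.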

Posted 1 day ago

Apply

2.0 - 6.0 years

0 Lacs

Karnataka

On-site

As a skilled Web Scraping Data Analyst, your primary responsibility will be collecting, cleaning, and analyzing data from various online sources. You will leverage your expertise in Python-based scraping frameworks, and your work will be instrumental in constructing the data pipelines that drive our analytics and business intelligence initiatives.

Key responsibilities:
- Design, develop, and maintain robust web scraping scripts using tools such as Python, BeautifulSoup, Scrapy, and Selenium
- Implement IP rotation, proxy management, and anti-bot evasion techniques for efficient data collection
- Collaborate with data analysts and data engineers to integrate web-sourced data into internal databases and reporting systems
- Conduct exploratory data analysis (EDA) to derive valuable insights from the scraped data
- Adhere to website scraping policies, robots.txt guidelines, and relevant data privacy regulations

Required skills:
- Proficiency in Python and experience with libraries such as Requests, BeautifulSoup, Scrapy, and Pandas
- Knowledge of proxy/VPN usage, IP rotation, and web traffic routing tools
- Familiarity with cloud platforms (AWS, Azure, or GCP) and Linux-based environments
- Experience deploying scraping scripts on edge servers or in containerized environments
- Solid understanding of HTML, CSS, JSON, and browser dev tools
- Strong analytical mindset with experience in data cleansing, transformation, and visualization
- Ability to handle large volumes of data and build efficient data pipelines
- Proficiency in SQL and basic data querying for data manipulation tasks

Preferred qualifications:
- Experience with headless browsers such as Puppeteer or Playwright
- Familiarity with scheduling tools such as Airflow or Cron
- Background in data analytics or reporting using tools like Tableau, Power BI, or Jupyter Notebooks

This full-time role requires an in-person work location.
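The IP rotation and proxy management this listing calls for often reduces to cycling requests through a proxy pool. A minimal sketch of that rotation logic follows; the proxy URLs are placeholders, and a real pool would come from a provider such as Smartproxy or BrightData (named in the listing):

```python
import itertools

# Placeholder endpoints; substitute real proxy URLs from your provider.
PROXIES = [
    "http://proxy-a.example:8080",
    "http://proxy-b.example:8080",
    "http://proxy-c.example:8080",
]

_pool = itertools.cycle(PROXIES)

def next_proxy_config():
    """Return a requests-style proxies dict, advancing round-robin through the pool."""
    proxy = next(_pool)
    return {"http": proxy, "https": proxy}

# Each call rotates to the next endpoint; a scraper would pass the dict to
# requests.get(url, proxies=next_proxy_config()) per request.
first = next_proxy_config()
second = next_proxy_config()
```

Production setups usually add health checks and back-off so a banned or dead proxy is skipped rather than blindly reused.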

Posted 2 days ago

Apply

3.0 - 7.0 years

0 Lacs

Maharashtra

On-site

In the Development & IT department, you will be responsible for the development, support, maintenance, and operation of Adform Flow. The primary focus is delivering a stable, competitive product to clients, emphasizing client value both internally and externally. As part of the AI efforts, you will collaborate with a cross-functional international Agile team of data scientists, engineers, product managers, and leaders across various global locations.

You are expected to:
- Be part of the Gen AI team driving customer behavior change towards intuitiveness
- Develop RAG pipelines for AI model training
- Collaborate closely with development and product teams on software implementation
- Act as a technical challenger to enhance team knowledge and mutual improvement opportunities

To excel in this role, you should possess:
- A Bachelor's degree in Computer Science or an equivalent engineering discipline
- Proficiency in Python development and working knowledge of libraries such as pandas, Requests, and NumPy
- Familiarity with LLM frameworks such as LangChain or LlamaIndex (a plus)
- Experience with web frameworks like FastAPI or Flask for API development
- Ability to build microservices exposing GraphQL/REST APIs for model access
- Proficiency in building CI/CD pipelines and experience with GitHub Actions
- Knowledge of Docker and Kubernetes for containerization and orchestration
- A strong understanding of PostgreSQL with vector search extensions such as pgvector for storage
- Effective communication and documentation skills
- An open and caring attitude, an agile and innovative approach, and an ownership mindset

Additional qualifications that will make you stand out:
- Knowledge of web technologies and Ad Tech industry solutions

As part of the team, you can expect:
- Short-term international travel requirements
- Daily learning and growth opportunities with exposure to modern technologies
- Collaboration with experts from different countries and offices worldwide
- A professional, dynamic, and client-focused working environment
- A commitment to an inclusive and diverse workplace
- Challenging tasks approached with creative problem-solving and ownership
- Team collaboration to achieve common goals
- Ambitious growth opportunities shared with you
- Encouragement of courage and learning from mistakes
- Health insurance, extra vacation days, and enhanced maternity and paternity leave policies

Adform is a global, independent advertising platform built for modern marketing. The Adform FLOW technology offers a user-friendly experience and a scalable, modular, open architecture for seamless campaign lifecycle management. Clients benefit from enhanced control, transparency, and data ownership throughout their advertising operations. Since 2002, Adform has been leveraging technology to improve business results for clients worldwide through human/machine collaboration and augmented intelligence.
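The retrieval half of a RAG pipeline like the one this role mentions can be sketched with plain cosine similarity. The three-dimensional "embeddings" and document names below are toy stand-ins for a real embedding model and a pgvector-backed store:

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Toy document "embeddings"; a real pipeline would embed text with a model
# and persist the vectors in PostgreSQL via pgvector.
DOCS = {
    "campaign setup guide": [0.9, 0.1, 0.0],
    "billing FAQ":          [0.1, 0.8, 0.3],
    "API reference":        [0.0, 0.2, 0.9],
}

def retrieve(query_vec, k=1):
    """Return the names of the k documents most similar to the query vector."""
    ranked = sorted(DOCS.items(), key=lambda kv: cosine(query_vec, kv[1]), reverse=True)
    return [name for name, _ in ranked[:k]]

# A query vector near the "campaign setup guide" embedding retrieves it first.
best = retrieve([0.85, 0.15, 0.05])
```

In the full pipeline, the retrieved documents are then stuffed into the LLM prompt as context; pgvector's `<=>` operator performs the same nearest-neighbor ranking inside Postgres at scale.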

Posted 1 week ago

Apply

3.0 - 8.0 years

0 - 2 Lacs

Chennai

Remote

JD: Web scraping of static and dynamic video sources from various websites across different geographical locations, using Python 3.x with the Selenium framework.

Mandatory skills:
- Python 3.x
- Expertise in web scraping and associated frameworks (mainly Selenium)
- Front-end knowledge (mainly HTML and CSS)
- Regex
- Various types of selectors (mainly CSS and XPath selectors)
- GitLab

Secondary skills:
- Python packages (mainly Pandas, urllib, Requests)
- SQL (basic knowledge, enough to develop complex queries)
- JavaScript
- Excel
- Good oral and written communication skills
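For static sources, the regex-plus-selector extraction this JD asks for can be sketched with Python's stdlib `re` module. The HTML fragment and CDN URLs below are made up; a dynamic source would first be rendered with Selenium (e.g., fetching `driver.page_source`) before applying the same pattern:

```python
import re

# Hypothetical static page fragment; a dynamic page would be rendered by
# Selenium first, then searched the same way.
HTML = '''
<video src="https://cdn.example.com/clips/intro.mp4"></video>
<video src="https://cdn.example.com/clips/demo.webm"></video>
'''

# Capture each src attribute value ending in a known video extension.
VIDEO_SRC = re.compile(r'src="([^"]+\.(?:mp4|webm|m3u8))"')

urls = VIDEO_SRC.findall(HTML)
print(urls)
```

For markup that regex handles poorly (nested tags, attributes in varying order), the CSS/XPath selectors the JD lists are the sturdier tool; regex is best kept for post-processing extracted attribute values.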

Posted 1 week ago

Apply

4.0 - 9.0 years

14 - 22 Lacs

Pune

Work from Office

Responsibilities:
- Design, develop, test, and maintain scalable Python applications using Scrapy, Selenium, and Requests
- Implement anti-bot systems and data pipeline solutions with Airflow and Kafka

Share your CV at recruitment@fortitudecareer.com. Flexi working; work from home available.
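The extract-transform-load shape of the data pipeline work described here can be sketched in plain Python; an in-process queue stands in for Kafka, a scheduler such as Airflow would invoke the stages on a cadence, and all names and sample records are illustrative:

```python
from queue import Queue

def extract(raw_pages):
    """Stand-in for the scraping stage: yields normalized raw strings."""
    for page in raw_pages:
        yield page.strip().lower()

def transform(records):
    """Stand-in for cleaning/enrichment: raw string -> structured row."""
    for rec in records:
        yield {"title": rec, "length": len(rec)}

def load(rows, sink: Queue):
    """Stand-in for the producer side of a Kafka topic or a DB writer."""
    for row in rows:
        sink.put(row)

# Wire the stages together; generators keep the pipeline lazy and memory-light.
sink = Queue()
load(transform(extract(["  Data Analyst ", "Scraper "])), sink)
results = [sink.get() for _ in range(sink.qsize())]
print(results)
```

With Kafka, each stage would instead consume from one topic and produce to the next, which lets the stages scale and fail independently; Airflow then only has to schedule and monitor them.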

Posted 2 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
