3.0 - 8.0 years
5 - 15 Lacs
Bengaluru
Remote
Role & responsibilities
As a Data Engineer focused on web crawling and platform data acquisition, you will design, develop, and maintain large-scale web scraping pipelines to extract valuable platform data. You will be responsible for implementing scalable and resilient data extraction solutions, ensuring seamless data retrieval while working with proxy management, anti-bot bypass techniques, and data parsing. Optimizing scraping workflows for performance, reliability, and efficiency will be a key part of your role. Additionally, you will ensure that all extracted data maintains high quality and integrity.
Preferred candidate profile
We are seeking candidates with:
- Strong experience in Python and web scraping frameworks such as Scrapy, Selenium, Playwright, or BeautifulSoup.
- Knowledge of distributed web crawling architectures and job scheduling.
- Familiarity with headless browsers, CAPTCHA-solving techniques, and proxy management to handle dynamic web challenges.
- Experience with data storage solutions, including SQL and cloud storage.
- Understanding of big data technologies like Spark and Kafka (a plus).
- Strong debugging skills to adapt to website structure changes and blockers.
- A proactive, problem-solving mindset and the ability to work effectively in a team-driven environment.
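As a rough illustration of the stack this role names, a minimal Scrapy spider sketch is shown below; the target URL, CSS selectors, and settings are hypothetical placeholders, not part of the posting.

```python
# A minimal, illustrative Scrapy spider; the domain and selectors are invented.
import scrapy


class ListingSpider(scrapy.Spider):
    name = "listing"
    start_urls = ["https://example.com/listings"]  # placeholder target

    custom_settings = {
        # Polite crawling and retries help with reliability at scale.
        "DOWNLOAD_DELAY": 1.0,
        "RETRY_TIMES": 3,
        "AUTOTHROTTLE_ENABLED": True,
    }

    def parse(self, response):
        # One item per listing card; the selectors are illustrative only.
        for card in response.css("div.listing"):
            href = card.css("a::attr(href)").get()
            yield {
                "title": card.css("h2::text").get(default="").strip(),
                "price": card.css("span.price::text").get(),
                "url": response.urljoin(href) if href else None,
            }
        # Follow pagination while a "next" link exists.
        next_page = response.css("a.next::attr(href)").get()
        if next_page:
            yield response.follow(next_page, callback=self.parse)
```

Running it with `scrapy runspider listing_spider.py -o listings.json` would write the yielded items to a JSON file.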
Posted 10 hours ago
0.0 - 1.0 years
2 - 3 Lacs
Gurugram
Work from Office
Selected intern's day-to-day responsibilities include:
- Developing and maintaining web applications using Python frameworks such as Django or Flask.
- Participating in web scraping projects to gather and process data from various sources.
- Working with MongoDB to store, retrieve, and manage data efficiently.
- Implementing and managing scheduled tasks and automation with cron jobs and Celery.
- Supporting cloud infrastructure setup and maintenance on AWS.
- Collaborating with the development team to design, develop, and test software solutions.
- Debugging and resolving technical issues.
- Documenting code and processes for future reference.
- Staying up to date with the latest industry trends and technologies.
About Company
At Lifease Solutions LLP, we believe that design and technology are the perfect blend to solve any problem and bring any idea to life. Lifease Solutions is a leading provider of software solutions and services that help businesses succeed. Based in Noida, India, we are committed to delivering high-quality, innovative solutions that drive value and growth for our customers. Our expertise in the finance, sports, and capital market domains has made us a trusted partner for companies around the globe. We take pride in our ability to turn small projects into big successes, and we are always looking for ways to help our clients maximize their IT investments.
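To illustrate the cron jobs / Celery responsibility above, here is a minimal sketch of a Celery beat schedule; the Redis broker URL, module name, and task body are assumptions for illustration only.

```python
# A minimal, illustrative Celery periodic-task setup (save as tasks.py).
from celery import Celery
from celery.schedules import crontab

app = Celery("tasks", broker="redis://localhost:6379/0")  # assumed broker


@app.task
def refresh_listings():
    # Placeholder for the real scraping/refresh logic.
    print("refreshing listings...")


app.conf.beat_schedule = {
    "refresh-every-morning": {
        "task": "tasks.refresh_listings",
        "schedule": crontab(hour=6, minute=0),  # run daily at 06:00
    },
}
```

A worker started with `celery -A tasks worker -B --loglevel=info` would then execute the task on the configured schedule, assuming a Redis broker is reachable.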
Posted 1 day ago
2.0 - 4.0 years
2 - 4 Lacs
Himatnagar
Work from Office
Responsibilities:
- Design, develop & maintain web scrapers using Python & Scrapy
- Ensure data accuracy & compliance with website terms
- Collaborate with cross-functional teams on project requirements
Work from home
Posted 1 day ago
3.0 - 8.0 years
0 - 1 Lacs
Chandigarh, Pune, Mumbai (all areas)
Work from Office
JD for Python + Web Scraping
Locations (WFO): Mumbai - Nariman Point, Express Tower / Pune - Hinjewadi Phase 2 / Chandigarh - Rajiv Gandhi Technology Park
Requirement: 3-7 years of experience in web scraping.
- Design, develop, and maintain web scraping scripts using Python.
- Extract, clean, and process large datasets from multiple sources.
- Identify and troubleshoot scraping issues and implement solutions.
- Ensure scrapers adhere to website policies and legal guidelines.
- Collaborate with data analysts and other team members to meet data requirements.
- Experience with various proxies and the ability to troubleshoot blocking.
- Optimize web scrapers for speed, reliability, and accuracy.
- Regularly update scrapers to adapt to website changes.
- Proven experience in web scraping and data extraction using Python.
Skills: Python, BeautifulSoup/Scrapy, Selenium, JavaScript, SQL/MongoDB/MySQL
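A hedged sketch of the proxy-rotation and block-troubleshooting pattern this JD refers to is shown below; the proxy endpoints, status codes treated as blocks, and back-off policy are illustrative assumptions.

```python
# Illustrative proxy rotation with simple back-off on apparent blocks.
import random
import time

import requests

PROXIES = [  # placeholder proxy endpoints
    "http://proxy-a.example:8000",
    "http://proxy-b.example:8000",
]


def fetch_with_rotation(url: str, max_attempts: int = 5) -> requests.Response:
    """Try a URL through different proxies, backing off on 403/429 responses."""
    for attempt in range(1, max_attempts + 1):
        proxy = random.choice(PROXIES)
        try:
            resp = requests.get(
                url,
                proxies={"http": proxy, "https": proxy},
                timeout=15,
            )
            if resp.status_code in (403, 429):
                # Looks blocked or rate-limited: wait, then rotate to another proxy.
                time.sleep(2 ** attempt)
                continue
            resp.raise_for_status()
            return resp
        except requests.RequestException:
            time.sleep(2 ** attempt)
    raise RuntimeError(f"Failed to fetch {url} after {max_attempts} attempts")
```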
Posted 2 days ago
0.0 - 5.0 years
4 - 9 Lacs
Thane
Remote
Testing and debugging applications. Developing back-end components. Integrating user-facing elements using server-side logic. Assessing and prioritizing client feature requests. Integrating data storage solutions.
Required Candidate profile: Knowledge of Python and related frameworks, including Django and Flask. A deep understanding of multi-process architecture and the threading limitations of Python.
Perks and benefits: Flexible hours. Remote work options.
Posted 2 days ago
3.0 - 7.0 years
14 - 18 Lacs
Mumbai, Pune, Chennai
Work from Office
Project description
Develop a scalable data collection, storage, and distribution platform to house data from vendors, research providers, exchanges, PBs, and web scraping. Make data available to systematic and fundamental PMs, and to enterprise functions: Ops, Risk, Trading, and Compliance. Develop internal data products and analytics.
Responsibilities
- Web scraping using scripts/APIs/tools
- Help build and maintain a greenfield data platform running on Snowflake and AWS
- Understand the existing pipelines and enhance them for new requirements
- Onboard new data providers
- Data migration projects
Skills
Must have: 10+ years of experience as a Data Engineer, SQL, Python, Linux, containerization (Docker, Kubernetes), good communication skills, AWS, strong on the DevOps side (K8s, Docker, Jenkins), readiness to work in the EU time zone, capital markets experience.
Nice to have: market data projects, Snowflake (a big plus), Airflow.
Location: Pune, Mumbai, Chennai, Bengaluru
Posted 3 days ago
7.0 - 12.0 years
25 - 30 Lacs
Indore, Nagpur, Hyderabad
Hybrid
A Python Lead with expertise in Django and AWS holds a pivotal role in the development and deployment of web applications, encompassing both technical leadership and hands-on contributions.
Required candidate profile: Python development and implementation; architecture and design; cloud infrastructure management; team management and collaboration.
Posted 3 days ago
7.0 - 12.0 years
25 - 30 Lacs
Noida, Gurugram, Greater Noida
Hybrid
A Python Lead with expertise in Django and AWS holds a pivotal role in the development and deployment of web applications, encompassing both technical leadership and hands-on contributions.
Required candidate profile: Python development and implementation; architecture and design; cloud infrastructure management; team management and collaboration.
Posted 3 days ago
7.0 - 12.0 years
25 - 30 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
A Python Lead with expertise in Django and AWS holds a pivotal role in the development and deployment of web applications, encompassing both technical leadership and hands-on contributions.
Required candidate profile: Python development and implementation; architecture and design; cloud infrastructure management; team management and collaboration.
Posted 3 days ago
0.0 - 4.0 years
0 Lacs
Ahmedabad, Gujarat
On-site
As a Python Developer with a specialization in web scraping, you will play a crucial role within our development team at XcelTec Interactive Pvt. Ltd. This position is perfect for individuals who have recently completed their education and are eager to delve into the world of Python programming while focusing on extracting and processing data from various websites efficiently.
Your primary responsibilities will include developing and managing web scraping scripts utilizing Python libraries such as BeautifulSoup and Scrapy. You will be tasked with extracting, parsing, and cleansing data from diverse websites, ensuring accuracy and performance through rigorous testing and debugging of your scraping scripts. Collaboration with team members to achieve project objectives and data requirements will be a key aspect of your role.
To excel in this position, you should possess a Bachelor's degree in Computer Science, Information Technology, or a related field. A foundational understanding of Python programming and its related libraries is essential, along with familiarity with HTML, CSS, and JavaScript to comprehend webpage structures. An appreciation of web scraping principles and ethical practices is also crucial, coupled with strong analytical and problem-solving capabilities. Effective time management and collaboration skills will be necessary to thrive in this role. Furthermore, candidates are required to have completed their graduation and not be enrolled in any ongoing academic programs. Having a portfolio or examples of Python or web scraping projects, whether academic or personal, will be considered advantageous.
If you are enthusiastic about Python development and web scraping, and meet the qualifications mentioned above, we encourage you to apply for this exciting opportunity. Please note that this is a full-time, permanent position for freshers, based at our office located at 301, 3rd Floor, Sheth Corporate Tower, Near Nagri Hospital, Ellis Bridge, Ahmedabad 380009. You can find more information about our company at www.xceltec.com. For further inquiries or to submit your application, you may contact our HR department at 9879691209 or 9879698003. We look forward to welcoming a dedicated and passionate Python Developer to our team at XcelTec Interactive Pvt. Ltd.
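For candidates new to the stack mentioned above, a minimal requests + BeautifulSoup example of extracting, cleaning, and saving data might look like the sketch below; the URL, selectors, and output schema are invented for illustration.

```python
# Illustrative extract-parse-save script with requests and BeautifulSoup.
import csv

import requests
from bs4 import BeautifulSoup

URL = "https://example.com/products"  # placeholder target

resp = requests.get(URL, timeout=15)
resp.raise_for_status()
soup = BeautifulSoup(resp.text, "html.parser")

rows = []
for item in soup.select("div.product"):  # illustrative selector
    name = item.select_one("h3")
    price = item.select_one("span.price")
    rows.append({
        "name": name.get_text(strip=True) if name else "",
        # Simple cleaning step: strip thousands separators from the price text.
        "price": (price.get_text(strip=True) if price else "").replace(",", ""),
    })

with open("products.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(f, fieldnames=["name", "price"])
    writer.writeheader()
    writer.writerows(rows)
```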
Posted 6 days ago
3.0 - 7.0 years
0 Lacs
Haryana
On-site
About Us: Our data and research teams at YipitData transform raw data into strategic intelligence, delivering accurate, timely, and deeply contextualized analysis that our customers, ranging from the world's top investment funds to Fortune 500 companies, depend on to drive high-stakes decisions. From sourcing and licensing novel datasets to rigorous analysis and expert narrative framing, our teams ensure clients get not just data, but clarity and confidence. We operate globally with offices in the US (NYC, Austin, Miami, Mountain View), APAC (Hong Kong, Shanghai, Beijing, Guangzhou, Singapore), and India. Our award-winning, people-centric culture, recognized by Inc. as a Best Workplace for three consecutive years, emphasizes transparency, ownership, and continuous mastery.
What It's Like to Work at YipitData: YipitData isn't a place for coasting; it's a launchpad for ambitious, impact-driven professionals. From day one, you'll take the lead on meaningful work, accelerate your growth, and gain exposure that shapes careers.
Why Top Talent Chooses YipitData:
- Ownership That Matters: You'll lead high-impact projects with real business outcomes.
- Rapid Growth: We compress years of learning into months.
- Merit Over Titles: Trust and responsibility are earned through execution, not tenure.
- Velocity with Purpose: We move fast, support each other, and aim high, always with purpose and intention.
If your ambition is matched by your work ethic and you're hungry for a place where growth, impact, and ownership are the norm, YipitData might be the opportunity you've been waiting for.
About The Role: We are seeking a Web Scraping Engineer to join our growing engineering team. In this hands-on role, you'll take ownership of designing, building, and maintaining robust web scrapers that power critical reports and customer experiences across our organization. You will work on complex, high-impact scraping challenges and collaborate closely with cross-functional teams to ensure our data ingestion processes are resilient, efficient, and scalable, while delivering high-quality data to our products and stakeholders.
As Our Web Scraping Engineer You Will:
- Refactor and Maintain Web Scrapers: Overhaul existing scraping scripts to improve reliability, maintainability, and efficiency. Implement best coding practices (clean code, modular architecture, code reviews, etc.) to ensure quality and sustainability.
- Implement Advanced Scraping Techniques: Utilize sophisticated fingerprinting methods (cookies, headers, user-agent rotation, proxies) to avoid detection and blocking. Handle dynamic content, navigate complex DOM structures, and manage session/cookie lifecycles effectively.
- Collaborate with Cross-Functional Teams: Work closely with analysts and other stakeholders to gather requirements, align on targets, and ensure data quality. Provide support, documentation, and best practices to internal stakeholders to ensure effective use of our web scraped data in critical reporting workflows.
- Monitor and Troubleshoot: Develop robust monitoring and alerting frameworks to quickly identify and address failures. Continuously evaluate scraper performance, proactively diagnosing bottlenecks and scaling issues.
- Drive Continuous Improvement: Propose new tooling, methodologies, and technologies to enhance our scraping capabilities and processes. Stay up to date with industry trends, evolving bot-detection tactics, and novel approaches to web data extraction.
This is a fully remote opportunity based in India. Standard work hours are from 11am to 8pm IST, but there is flexibility here.
You Are Likely To Succeed If:
- You communicate effectively in English with both technical and non-technical stakeholders.
- You have a track record of mentoring engineers and managing performance in a fast-paced environment.
- You have 3+ years of experience with web scraping frameworks (e.g., Selenium, Playwright, or Puppeteer).
- You have a strong understanding of HTTP, RESTful APIs, HTML parsing, browser rendering, and TLS/SSL mechanics.
- You have expertise in advanced fingerprinting and evasion strategies (e.g., browser fingerprint spoofing, request signature manipulation).
- You have deep experience managing cookies, headers, session states, and proxy rotations, including the deployment of both residential and data center proxies.
- You have experience with logging, metrics, and alerting to ensure high availability.
- You have the troubleshooting skills to optimize scraper performance for efficiency, reliability, and scalability.
What We Offer: Our compensation package includes comprehensive benefits, perks, and a competitive salary. We care about your personal life, and we mean it. We offer flexible work hours, flexible vacation, a generous 401K match, parental leave, team events, a wellness budget, learning reimbursement, and more! Your growth at YipitData is determined by the impact that you are making, not by tenure, unnecessary facetime, or office politics. Everyone at YipitData is empowered to learn, self-improve, and master their skills in an environment focused on ownership, respect, and trust. See more on our high-impact, high-opportunity work environment above!
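As a simplified sketch of the header/user-agent rotation and session handling this role describes (real-world fingerprint evasion involves much more, such as TLS and browser-level signals), something along these lines could serve as a starting point; the user-agent strings, headers, and proxy parameter are illustrative.

```python
# Illustrative session builder with rotated headers and an optional proxy.
import random
from typing import Optional

import requests

USER_AGENTS = [  # example desktop user-agent strings; rotated per session
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36",
    "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36",
]


def build_session(proxy: Optional[str] = None) -> requests.Session:
    """Return a session with rotated headers and an optional proxy."""
    session = requests.Session()
    session.headers.update({
        "User-Agent": random.choice(USER_AGENTS),
        "Accept-Language": "en-US,en;q=0.9",
        "Accept": "text/html,application/xhtml+xml",
    })
    if proxy:
        session.proxies.update({"http": proxy, "https": proxy})
    return session


# Cookies set by earlier responses are reused automatically on later requests,
# which keeps session state consistent across a crawl.
session = build_session()
response = session.get("https://example.com", timeout=15)  # placeholder URL
```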
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Karnataka
On-site
ACL Digital is currently seeking an experienced Senior Automation Tester - Network Security to join the team. The ideal candidate should have a minimum of 5 years of industry experience with hands-on expertise in end-to-end solution testing in the Security SD-WAN domain.
Key Requirements:
- Strong understanding of network security concepts, protocols, and technologies.
- Experience in Quality Assurance testing of VPN technologies such as IKEv2, IKEv1, IPSec, SSL/TLS.
- Proficiency in testing SD-WAN technologies and solutions.
- Familiarity with network devices, L2/L3 protocols, and traffic generation tools like Ixia and Spirent.
- Knowledge of next-generation network security standards (e.g., Post-Quantum Cryptography) and best practices.
- Proficient in Python and its standard libraries, specifically in developing APIs for infra automation.
- Hands-on experience with automation tools and frameworks like Selenium and REST APIs.
- Solid understanding of RESTful APIs, web scraping, and automation of web-based systems.
- Experience working with version control systems like Git and CI/CD tools such as Jenkins and GitHub Actions is a plus.
- Ability to collaborate effectively with key stakeholders throughout the software development life cycle.
- Motivated self-starter with excellent communication skills and a track record of delivering high-quality products within tight deadlines.
Soft Skills:
- Strong communication and organizational skills.
- Proven ability to lead and deliver superior products in a team environment with challenging timelines.
- Excellent problem-solving skills and system design expertise in building large-scale distributed systems.
Immediate joiners are preferred. Interested candidates can share their resumes at zahid.h@acldigital.com.
Posted 1 week ago
2.0 - 6.0 years
0 Lacs
Jaipur, Rajasthan
On-site
You will be joining our team as a highly motivated and experienced candidate. Your main responsibilities will include developing and implementing AI models using Python, working with Large Language Model (LLM) APIs, and utilizing them effectively in solutions. You will be expected to analyze problems, frame solutions, and actively contribute valuable ideas during project discussions. Additionally, debugging and solving issues, including novel errors encountered in generative AI work, will be an essential part of your role. It is crucial to communicate effectively with stakeholders, explaining technical details and decisions, and apply machine learning techniques to various projects.
To excel in this role, you should have proficiency in Python programming and hands-on experience with LLM APIs. Basic knowledge of cloud computing, with a preference for familiarity with Azure, is also required. Experience in web scraping, some knowledge of prompt engineering, and excellent problem-solving skills will be beneficial. You should have the ability to actively listen and provide constructive ideas, along with the eagerness to learn and stay updated with the latest trends in AI and technology. Strong communication skills, both verbal and written, are essential for explaining complex concepts to non-technical stakeholders. A fundamental understanding of machine learning models is also expected.
This is a full-time position, and the benefits include health insurance and Provident Fund.
Posted 1 week ago
2.0 - 6.0 years
0 Lacs
Hyderabad, Telangana
On-site
As a Web Scraping Engineer, you will join our dynamic team and play a crucial role in driving our data-driven strategies. Your primary responsibility will be to develop and maintain innovative solutions to automate data extraction, parsing, and structuring from various online sources. By utilizing your expertise in web scraping, you will empower our business intelligence, market research, and decision-making processes.
Key Responsibilities
- Design, implement, and maintain web scraping solutions to collect structured data from publicly available online sources and APIs.
- Parse, clean, and transform extracted data to ensure accuracy and usability for business needs.
- Store and organize collected data in databases or spreadsheets for easy access and analysis.
- Monitor and optimize scraping processes for efficiency, reliability, and compliance with relevant laws and website policies.
- Troubleshoot issues related to dynamic content, anti-bot measures, and changes in website structure.
- Collaborate with data analysts, scientists, and other stakeholders to understand data requirements and deliver actionable insights.
- Document processes, tools, and workflows for ongoing improvements and knowledge sharing.
Requirements
- Proven experience in web scraping, data extraction, or web automation projects.
- Proficiency in Python or similar programming languages, and familiarity with libraries such as BeautifulSoup, Scrapy, or Selenium.
- Strong understanding of HTML, CSS, JavaScript, and web protocols.
- Experience with data cleaning, transformation, and storage (e.g., CSV, JSON, SQL/NoSQL databases).
- Knowledge of legal and ethical considerations in web scraping, with a commitment to compliance with website terms of service and data privacy regulations.
- Excellent problem-solving and troubleshooting skills.
- Ability to work independently and manage multiple projects simultaneously.
Preferred Qualifications
- Experience with cloud platforms (AWS, GCP, Azure) for scalable data solutions.
- Familiarity with workflow automation and integration with communication tools (e.g., email, Slack, APIs).
- Background in market research, business intelligence, or related fields.
Skills: data extraction, data cleaning, BeautifulSoup, business intelligence, web automation, JavaScript, web scraping, data privacy regulations, web protocols, Selenium, Scrapy, SQL, data transformation, NoSQL, CSS, market research, automation, Python, HTML.
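A small sketch of the parse-clean-store step described in the responsibilities above; the raw records, schema, and file names are invented for illustration.

```python
# Illustrative clean-transform-store step: JSON for analysts, SQLite for SQL access.
import json
import sqlite3

raw_records = [  # e.g. output of a scraper; values are illustrative
    {"name": " Widget A ", "price": "1,299", "in_stock": "yes"},
    {"name": "Widget B", "price": "899", "in_stock": "no"},
]


def clean(record: dict) -> dict:
    """Normalize whitespace, numeric types, and booleans."""
    return {
        "name": record["name"].strip(),
        "price": float(record["price"].replace(",", "")),
        "in_stock": record["in_stock"].lower() == "yes",
    }


cleaned = [clean(r) for r in raw_records]

# Persist as JSON for downstream consumers...
with open("products.json", "w", encoding="utf-8") as f:
    json.dump(cleaned, f, indent=2)

# ...and into SQLite so the same data can be queried with SQL.
conn = sqlite3.connect("products.db")
conn.execute(
    "CREATE TABLE IF NOT EXISTS products (name TEXT, price REAL, in_stock INTEGER)"
)
conn.executemany(
    "INSERT INTO products (name, price, in_stock) VALUES (?, ?, ?)",
    [(r["name"], r["price"], int(r["in_stock"])) for r in cleaned],
)
conn.commit()
conn.close()
```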
Posted 2 weeks ago
1.0 - 5.0 years
0 Lacs
Pune, Maharashtra
On-site
As a Web Scraper, you will be responsible for leveraging your expertise to collect data from various online sources. Your duties will include developing robust web scrapers and parsers for different websites, extracting both structured and unstructured data, and storing it in SQL or NoSQL databases. You will collaborate closely with Project, Business, and Research teams to provide them with the scraped data for analysis. Your role will also involve maintaining the scraping projects that have been deployed to production, as well as creating frameworks to automate and ensure a continuous flow of data from multiple sources. Working independently with minimal supervision, you will need to gain a deep understanding of web data sources and be able to identify, extract, parse, and store relevant data effectively.
### Responsibilities:
- Apply your knowledge to fetch data from multiple online sources
- Develop reliable web scrapers and parsers for various websites
- Extract structured/unstructured data and store it in SQL/NoSQL databases
- Collaborate with Project/Business/Research teams to provide scraped data for analysis
- Maintain scraping projects in production
- Create frameworks for automating and ensuring a constant flow of data from multiple sources
- Work independently with minimal supervision
- Develop a deep understanding of web data sources to identify, extract, parse, and store data effectively
### Required Skills And Experience:
- 1 to 3 years of experience as a Web Scraper
- Proficient in Python and familiar with web crawling/web scraping using Python Requests, BeautifulSoup or urllib, Selenium, and Playwright
- Strong knowledge of basic Linux commands for system navigation, management, and troubleshooting
- Expertise in proxy usage for secure and efficient network operations
- Experience with captcha-solving techniques for automation and data extraction
- Knowledge of data parsing techniques
- Proficiency in Regular Expressions, HTML, CSS, DOM, and XPath
- Familiarity with JavaScript is a plus
- Ability to access, manipulate, and transform data from various database and flat file sources
- Essential skills in MongoDB & MySQL
- Ability to develop reusable code-based scraping products for broader use
- Proficiency in Git for version control and collaborative development workflows
- Experience with cloud servers on platforms like AWS, GCP, and LEAPSWITCH for scalable and reliable infrastructure management
- Strong problem-solving skills and innovative thinking
- Ability to deliver results in a clear and understandable manner for clients
### Note: This job description is sourced from hirist.tech.
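To illustrate the Playwright requirement above, a minimal sketch for scraping JavaScript-rendered content might look like this; the URL and selectors are placeholders.

```python
# Illustrative Playwright (sync API) snippet for dynamic, JS-rendered pages.
from playwright.sync_api import sync_playwright

with sync_playwright() as p:
    browser = p.chromium.launch(headless=True)
    page = browser.new_page()
    page.goto("https://example.com/listings", timeout=30000)  # placeholder URL
    # Wait for JavaScript-rendered content before reading the DOM.
    page.wait_for_selector("div.listing")
    titles = page.locator("div.listing h2").all_inner_texts()
    browser.close()

print(titles)
```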
Posted 2 weeks ago
0.0 - 4.0 years
0 Lacs
Ahmedabad, Gujarat
On-site
As a Full-stack Engineer on our team, you will play a pivotal role in shaping both the backend and frontend of our product. If you're excited about taking ownership of the entire stack, making key technical decisions, and creating exceptional user experiences, this role is for you. Specialized in a specific part of the stack? We value depth as much as versatility, so don't hesitate to reach out.
- Design, develop, and maintain RESTful APIs, socket-based APIs and backend services using Node.js.
- Build and optimize scalable database systems with PostgreSQL to handle large-scale traffic and data.
- Architect and maintain our AWS infrastructure to ensure scalability, reliability, and security.
- Collaborate with front-end developers to integrate APIs seamlessly with the user interface.
- Develop responsive and visually stunning user interfaces using React.js, ensuring a smooth and intuitive user experience.
- Take ownership of UI/UX design decisions, making sure the product is both functional and aesthetically pleasing.
- Implement web scraping solutions to gather and integrate data from external sources.
- Ensure code quality through testing, debugging, and performance tuning across the stack.
- Lead technical discussions and contribute to architectural decisions for both backend and frontend systems.
Join a fast-paced start-up environment where you'll have the opportunity to learn, grow, and make a significant impact. Work on challenging problems that require innovative solutions, collaborating with a talented and driven team. Enjoy a culture of continuous improvement, where your ideas and contributions are valued and rewarded. BS/MS degree in Computer Science, Engineering, or a related subject.
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
Haryana
On-site
You are as unique as your background, experience and point of view. Here, you'll be encouraged, empowered and challenged to be your best self. You'll work with dynamic colleagues - experts in their fields - who are eager to share their knowledge with you. Your leaders will inspire and help you reach your potential and soar to new heights. Every day, you'll have new and exciting opportunities to make life brighter for our Clients - who are at the heart of everything we do. Discover how you can make a difference in the lives of individuals, families and communities around the world.
Role Summary: The Micro Automation Specialist is a full-time developer who will join one of the Operations teams of highly skilled professionals focused on delivering solutions to clients. The candidate will deliver desktop applications and automated solutions targeted at providing value to Sun Life Financial through innovation and collaboration, and will provide solutions spanning a wide range of technologies including Robotic Automation, Microsoft Access, Microsoft Excel and Microsoft SQL Server. The candidate will be encouraged to take advantage of the opportunities for both personal growth and continuous learning provided by the team and by Sun Life Financial.
Key Accountabilities:
- Analyze requirements and work with business stakeholders to understand their requests.
- Develop new application functionality and enhance/maintain current features.
- Responsible for the creation, design, development and implementation of RPA systems.
- Required to investigate, analyze, and set up automated processes to maximize efficiency for the business model.
- Work with the ORM Team to review the MA Tool.
- Provide support and assist the BPA Team.
- Build a strong relationship with business unit leaders and onshore partners.
Skill/Competencies:
- Minimum of 5+ years of industry experience in VBA & SQL.
- High proficiency in MS Office applications such as Excel, Access and PowerPoint.
- Strong with Microsoft SQL and relational databases.
- Must have design skills to create user interfaces.
- VBA and .NET programming experience with any database.
- Knowledge of the PEGA automation tool is preferred.
- Well versed in working in an Agile environment.
- Strong analytical, problem-solving, and process improvement skills.
- Should have knowledge of web scraping.
- Knowledge of data visualization best practices (capability to create visualizations, dashboards, and MIS).
Job Category: Customer Service / Operations
Posting End Date: 29/06/2025
Posted 2 weeks ago
0.0 - 4.0 years
0 Lacs
Delhi
On-site
The selected intern will be responsible for collecting and extracting data from primary and secondary sources using automated tools, databases, APIs, or web scraping techniques. They will assist in cleaning, preprocessing, and organizing data for analysis and reporting purposes. The intern will support the identification of patterns, trends, and anomalies in structured and unstructured datasets. Utilizing tools such as Python, Excel, SQL, or R, they will process large volumes of data. Collaborating with analysts, the intern will contribute to building and maintaining data dashboards, visualizations, and reports. Additionally, they will assist in applying data mining techniques such as classification, clustering, regression, and association rule mining. The intern will also help maintain documentation and data integrity standards to ensure accurate results. Furthermore, they will conduct competitor or market intelligence research using data scraping and public databases (if applicable) and stay updated on data privacy regulations to ensure ethical handling of sensitive data.
LogicizeIP is devoted to helping individual entrepreneurs and inventors, institutions, universities, and companies secure their IP rights for their ideas, inventions, and designs at an affordable cost worldwide. The company's mission is to make IP services easier for both new and existing users to protect their creations conveniently and hassle-free. LogicizeIP ensures that your idea always belongs to you and is completely protected, aiming to encourage inventiveness and shield it.
Posted 2 weeks ago
0.0 - 3.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
You will be joining our team as a Junior Data Scraping Specialist at one of the leading IT product companies. In this role, you will receive training and mentorship to support in collecting, cleaning, and transforming data from various online sources. This position offers an excellent opportunity to initiate your career in data engineering and web scraping.
Your responsibilities will include assisting in data collection by scraping information from websites using basic scraping tools and techniques. You will learn to utilize popular web scraping libraries and tools, clean and organize raw data into structured formats like CSV, JSON, or Excel files, and support in automating data scraping tasks through simple scripts and tools guided by senior team members. Additionally, you will conduct research to identify potential data sources, test scraping scripts, and collaborate closely with data analysts and developers to gather data for reports or analysis.
To excel in this role, you should have basic knowledge of HTML, CSS, and web structures, problem-solving abilities to troubleshoot data extraction issues, attention to detail to ensure data accuracy, eagerness to learn new skills related to web scraping and data processing, and excellent communication skills for effective collaboration within the team. This position requires 0-1 year of experience, making it an ideal opportunity for individuals looking to kick-start their career in the field of data engineering and web scraping.
Posted 3 weeks ago
0.0 - 4.0 years
0 Lacs
Karnataka
On-site
As an individual contributor on the team, you will assist in the development of web scraping and automation scripts under supervision. You will have the opportunity to learn and implement basic scraping techniques for both static and dynamic websites. Collaboration with the team will be crucial as you work together to build and test data ingestion pipelines. Your responsibility will include writing clean and maintainable Python code for small to medium-sized tasks. Additionally, you will be expected to debug and resolve basic issues in scraping workflows and scripts. Working with APIs such as REST and GraphQL to fetch and ingest data will be a part of your regular tasks. Documenting your code and contributing to the maintenance of internal knowledge bases will also be an essential aspect of your role.
In terms of your basic programming skills, you should have a good understanding of Python fundamentals, including modules, functions, loops, and error handling. Familiarity with libraries like `requests`, `BeautifulSoup`, and `XML` will be beneficial for your tasks. Having a grasp of web development basics, including HTML, CSS, and JavaScript, as well as concepts like XPath and the DOM, will be advantageous. Basic knowledge of JSON and CSV file formats for data handling is expected, along with the ability to perform simple operations on databases like MySQL or MongoDB. Familiarity with Git for basic version control tasks like commits, branches, and pulls is also desirable.
Join our team at an IT consulting and services company that specializes in cutting-edge solutions in Data Analytics, AI, and Web Scraping technologies.
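A minimal sketch of the REST API ingestion task mentioned above; the endpoint and its `page`/`results` response fields are assumptions, not a real API.

```python
# Illustrative paginated REST ingestion with requests, saved to JSON.
import json

import requests

BASE_URL = "https://api.example.com/items"  # placeholder endpoint


def fetch_all(max_pages: int = 10) -> list[dict]:
    """Collect results across pages until an empty page is returned."""
    items = []
    for page in range(1, max_pages + 1):
        resp = requests.get(BASE_URL, params={"page": page}, timeout=15)
        resp.raise_for_status()
        batch = resp.json().get("results", [])
        if not batch:
            break  # no more pages
        items.extend(batch)
    return items


if __name__ == "__main__":
    data = fetch_all()
    with open("items.json", "w", encoding="utf-8") as f:
        json.dump(data, f, indent=2)
```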
Posted 3 weeks ago
7.0 - 11.0 years
0 Lacs
Karnataka
On-site
About Us
Thoucentric provides end-to-end consulting solutions to tackle diverse business challenges across industries. With a focus on leveraging deep domain expertise, cutting-edge technology, and a results-driven approach, we assist organizations in optimizing operations, improving decision-making, and fostering growth. Headquartered in Bangalore, we have a global presence in India, US, UK, Singapore, and Australia. Our services span Business Consulting, Program & Project Management, Digital Transformation, Product Management, Process & Technology Solutioning, and Execution, encompassing areas such as Analytics & Emerging Tech in functional domains like Supply Chain, Finance & HR, Sales & Distribution. We pride ourselves on executing solutions rather than just offering advice, collaborating with leading names in the CPG industry, tech sector, and start-up ecosystem. Recognized as a "Great Place to Work" by AIM and ranked among the "50 Best Firms for Data Scientists to Work For", we boast a seasoned consulting team of over 500 professionals across six global locations.
Job Description
About the Role: We are in search of a BI Architect to support the BI Lead of a global CPG organization by designing an intelligent and scalable Business Intelligence ecosystem. The role involves crafting an enterprise-wide KPI dashboard suite enhanced by a GenAI-powered natural language interface for insightful exploration.
Responsibilities
- Architect BI Stack: Develop and supervise a scalable and efficient BI platform serving as the central source for critical business metrics across functions.
- Advise BI Lead: Collaborate as a technical advisor to the BI Lead, ensuring alignment of architecture decisions with long-term strategies and business priorities.
- Design GenAI Layer: Create a GenAI-driven natural language interface for BI dashboards to enable conversational querying of KPIs, trends, and anomalies by business users.
- RAG/Graph Approach: Implement suitable architectures like RAG with vector stores or Knowledge Graphs to deliver intelligent, context-rich insights.
- External Data Integration: Establish mechanisms for organizing and integrating data from external sources (e.g., competitor websites, industry reports) to enhance internal insights.
- Security & Governance: Maintain adherence to enterprise data governance, security, and compliance standards across all layers (BI + GenAI).
- Cross-functional Collaboration: Engage closely with Data Engineering, Analytics, and Product teams to ensure seamless integration and operationalization of the BI ecosystem.
Requirements / Qualifications
- 9 years of BI architecture and analytics platform experience, with at least 2 years focused on GenAI, RAG, or LLM-based solutions.
- Profound expertise in BI tools (e.g., Power BI, Tableau, Looker) and data modeling.
- Familiarity with GenAI frameworks (e.g., LangChain, LlamaIndex, Semantic Kernel) and vector databases (e.g., Pinecone, FAISS, Weaviate).
- Knowledge of graph-based data models and tools (e.g., Neo4j, RDF, SPARQL) is advantageous.
- Proficiency in Python or a relevant scripting language for pipeline orchestration and AI integration.
- Experience in web scraping and structuring external/third-party datasets.
- Previous exposure to the CPG domain or large-scale KPI dashboarding is preferred.
Benefits
Joining Thoucentric as a Consultant offers:
- Opportunity to shape your career path independently.
- Engaging consulting environment working with Fortune 500 companies and startups.
- Supportive and dynamic workplace fostering personal growth.
- Inclusive culture with opportunities for bonding beyond work.
- Participation in Thoucentric's growth journey.
Skills Required: BI architecture, Analytics, Data Visualization
Practice Name: Data Visualization
Date Opened: 07/15/2025
Work Mode: Hybrid
Job Type: Full time
Industry: Consulting
Corporate Office: Thoucentric, The Hive, Mahadevapura, Bengaluru, Karnataka, India, 560048
Posted 3 weeks ago
4.0 - 8.0 years
0 Lacs
hyderabad, telangana
On-site
YipitData is a leading market research and analytics firm in the disruptive economy that has recently secured up to $475M from The Carlyle Group, valuing the company at over $1B. For three consecutive years, YipitData has been recognized as one of Inc.'s Best Workplaces, fostering a people-centric culture centered around mastery, ownership, and transparency. As a fast-growing technology company with offices in various locations worldwide, including NYC, Austin, Miami, and others, we offer exciting opportunities for individuals looking to make a significant impact.
As a Web Scraping Specialist at YipitData, you will be part of the Data Solutions team, reporting directly to the Data Solutions Engineering Manager. Your role will involve designing, refactoring, and maintaining web scrapers crucial for generating key reports across the organization. By ensuring the efficiency, reliability, and scalability of our data ingestion processes, you will directly support multiple business units and products.
Key Responsibilities:
- Refactor and maintain web scrapers to enhance reliability, maintainability, and efficiency.
- Implement advanced scraping techniques using fingerprinting methods to evade detection and blocking.
- Collaborate with cross-functional teams to gather requirements, align on targets, and ensure data quality.
- Monitor, troubleshoot, and optimize scraper performance to address failures and bottlenecks.
- Drive continuous improvement by proposing new methodologies and technologies to enhance scraping capabilities.
To succeed in this role, effective communication skills in English, a minimum of 4 years of experience with web scraping frameworks, and a strong understanding of HTTP, RESTful APIs, and HTML parsing are essential. Additionally, expertise in advanced fingerprinting and evasion strategies, managing cookies, headers, and proxies, as well as troubleshooting skills for optimizing scraper performance, are required.
This is a fully-remote position based in India with flexible working hours from 11am to 8pm IST. At YipitData, we offer a competitive compensation package, comprehensive benefits, and a supportive work environment focused on personal growth and skill mastery. Your impact and contributions are what drive your growth at YipitData, where learning, self-improvement, and skill development are encouraged and valued.
Posted 3 weeks ago
5.0 - 9.0 years
0 Lacs
Kolkata, West Bengal
On-site
As a global leader in assurance, tax, transaction, and advisory services, EY is dedicated to hiring and developing passionate professionals to contribute to building a better working world. At EY, you are offered a culture that values training, opportunities, and creative freedom, focusing not just on your current abilities but also on your potential for growth. Your career at EY is yours to shape, with limitless possibilities, motivating experiences, and continuous support to help you reach your professional best.
EY has an exciting opportunity for the position of Senior Consultant in the National-Assurance-ASU - Audit - Standards and Methodologies team based in Kolkata. The Assurance team at EY aims to inspire confidence and trust in a complex world by protecting the public interest, promoting transparency, supporting investor confidence, and fostering talent for future business leaders. You will be involved in ensuring compliance with audit standards, providing clear perspectives to audit committees, and delivering critical information for stakeholders.
Your responsibilities will include demonstrating technical excellence in audit analytics, foundational analytics, visualization, data extraction, risk-based analytics, and business understanding. Proficiency in databases, ETL, SQL, visualization tools like Tableau, Spotfire, or Qlikview, and experience in Machine Learning using R or Python is required. You should also have expertise in MS Office Suite, NLP, Web Scraping, Log Analytics, TensorFlow, AI, and Beautiful Soup.
To qualify for this role, you should hold a qualification in BE/B.Tech, MSC in Computer Science/Statistics, or M.C.A, along with 5-7 years of relevant experience. EY is looking for individuals who can work collaboratively across various client departments, offer practical solutions to complex problems, and possess agility, curiosity, mindfulness, and positive energy.
EY offers a dynamic work environment with a focus on skills development, learning, and career progression. As a part of EY, you will have access to personalized career journeys and resources to enhance your roles, skills, and opportunities. EY is dedicated to being an inclusive employer, ensuring the well-being and career growth of its employees while delivering excellent client service.
If you meet the criteria mentioned above and are ready to contribute to building a better working world, apply now to join EY and be a part of a team committed to making a positive impact.
Posted 3 weeks ago
3.0 - 5.0 years
9 - 11 Lacs
Pune
Work from Office
Hiring a Senior Data Engineer for an AI-native startup. Work on scalable data pipelines, LLM workflows, web scraping (Scrapy, lxml), Pandas, APIs, and Django. Strong in Python, data quality, mentoring, and large-scale systems. Benefits: health insurance.
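A brief sketch of the lxml + Pandas combination this posting names, parsing an HTML fragment into a DataFrame with a simple data-quality check; the markup and XPath expressions are invented for illustration.

```python
# Illustrative lxml parsing into a Pandas DataFrame with a basic quality check.
import pandas as pd
from lxml import html

raw_html = """
<table>
  <tr><td>Widget A</td><td>1299</td></tr>
  <tr><td>Widget B</td><td>899</td></tr>
</table>
"""

tree = html.fromstring(raw_html)
rows = [
    {
        "name": tr.xpath("./td[1]/text()")[0],
        "price": int(tr.xpath("./td[2]/text()")[0]),
    }
    for tr in tree.xpath("//table/tr")
]

df = pd.DataFrame(rows)
# Basic data-quality check before the data enters a pipeline.
assert df["price"].notna().all()
print(df.describe())
```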
Posted 3 weeks ago
2.0 - 6.0 years
0 Lacs
Haryana
On-site
As a Data Analyst with Market Research and Web Scraping skills at our company located in Udyog Vihar Phase-1, Gurgaon, you will be expected to leverage your 2-5 years of experience in data analysis, particularly in competitive analysis and market research within the Fashion/garment/apparel industry. A Bachelor's degree in Data Science, Computer Science, Statistics, Business Analytics, or a related field is required, while advanced degrees or certifications in data analytics or market research are considered a plus.
Your main responsibility will be to analyze large datasets to identify trends, patterns, and insights related to market trends and competitor performance. You will conduct quantitative and qualitative analyses to support decision-making in product development and strategy. Additionally, you will be involved in performing in-depth market research to track competitor performance, emerging trends, and customer preferences.
Furthermore, you will design and implement data scraping solutions to gather competitor data from websites, ensuring compliance with legal standards and respect for website terms of service. Creating and maintaining organized databases with market and competitor data for easy access and retrieval will be part of your routine, along with collaborating closely with cross-functional teams to align data insights with company objectives.
To excel in this role, you should have proven experience with data scraping tools such as BeautifulSoup, Scrapy, or Selenium, proficiency in SQL, Python, or R for data analysis and data manipulation, and experience with data visualization tools like Tableau, Power BI, or D3.js. Strong analytical skills and the ability to interpret data to draw insights and make strategic recommendations are essential.
If you are passionate about data analysis, market research, and web scraping and possess the technical skills and analytical mindset required, we encourage you to apply by sending your updated resume with current salary details to jobs@glansolutions.com. For any inquiries, please contact Satish at 8802749743 or visit our website at www.glansolutions.com. Join us on this exciting journey of leveraging data to drive strategic decisions and make a meaningful impact in the Fashion/garment/apparel industry.
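As an illustration of the analysis side of this role, a short Pandas sketch aggregating scraped competitor prices into monthly trends is shown below; the CSV file and its columns are hypothetical.

```python
# Illustrative trend analysis over scraped competitor prices.
import pandas as pd

# Assumed input: columns date, competitor, product, price.
df = pd.read_csv("competitor_prices.csv", parse_dates=["date"])

# Average price per competitor per month to surface pricing trends.
df["month"] = df["date"].dt.to_period("M")
monthly = df.groupby(["competitor", "month"])["price"].mean().reset_index()
print(monthly.head())
```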
Posted 3 weeks ago