4.0 years
0 Lacs
Jaipur, Rajasthan, India
Remote
Experience: 4.00+ years
Salary: Confidential (based on experience)
Shift: (GMT+05:30) Asia/Kolkata (IST)
Opportunity Type: Remote
Placement Type: Full-time, permanent position
(Note: This is a requirement for one of Uplers' clients, Forbes Advisor.)

What do you need for this opportunity?
Must-have skills required: TensorFlow, PyTorch, RAG, LangChain

Forbes Advisor is looking for:
Location: Remote (hybrid for candidates from Chennai or Mumbai)

Forbes Advisor is a new initiative for consumers under the Forbes Marketplace umbrella that provides journalist- and expert-written insights, news and reviews on all things personal finance, health, business, and everyday life decisions. We do this by providing consumers with the knowledge and research they need to make informed decisions they can feel confident in, so they can get back to doing the things they care about most.

At Marketplace, our mission is to help readers turn their aspirations into reality. We arm people with trusted advice and guidance, so they can make informed decisions they feel confident in and get back to doing the things they care about most. We are an experienced team of industry experts dedicated to helping readers make smart decisions and choose the right products with ease. Marketplace boasts decades of experience across dozens of geographies and teams, including Content, SEO, Business Intelligence, Finance, HR, Marketing, Production, Technology and Sales. The team brings rich industry knowledge to Marketplace’s global coverage of consumer credit, debt, health, home improvement, banking, investing, credit cards, small business, education, insurance, loans, real estate and travel.

The Data Extraction Team is a brand-new team that plays a crucial role in our organization by designing, implementing, and overseeing advanced web scraping frameworks. Its core function is creating and refining tools and methodologies to efficiently gather precise, meaningful data from a diverse range of digital platforms. The team is also tasked with constructing robust data pipelines and implementing Extract, Transform, Load (ETL) processes, which are essential for seamlessly transferring the harvested data into our data storage systems and ensuring it is readily available for analysis and use.

A typical day for a Data Research Engineer involves generating ideas on how the company and team can best harness the power of AI/LLMs, not only to simplify operations within the team but also to streamline the research team's work in gathering and retrieving large data sets. This is a leadership role: the engineer sets a vision for the future use of AI/LLMs within the team and the company, thinks outside the box, proactively engages with new technologies, and develops new ideas to move the team forward in the AI/LLM field. The candidate should also be willing to acquire at least basic skills in scraping and data pipelining.

Responsibilities:
- Develop methods to leverage the potential of LLMs and AI within the team.
- Proactively find new solutions that engage the team with AI/LLMs and streamline team processes.
- Be a visionary with AI/LLM tools: anticipate how emerging technologies could be harnessed so that the team is ahead of the game when they arrive.
- Assist in acquiring and integrating data from various sources, including web crawling and API integration.
- Stay updated on emerging technologies and industry trends.
- Explore third-party technologies as alternatives to legacy approaches for efficient data pipelines.
- Contribute to cross-functional teams in understanding data requirements.
- Assume accountability for achieving development milestones.
- Prioritize tasks to ensure timely delivery in a fast-paced environment with rapidly changing priorities.
- Collaborate with and assist fellow members of the Data Research Engineering Team as required.
- Leverage online resources such as Stack Overflow, ChatGPT, Bard, etc. effectively, while considering their capabilities and limitations.

Skills and Experience:
- Bachelor's degree in Computer Science, Data Science, or a related field; higher qualifications are a plus.
- Thinks proactively and creatively about upcoming AI/LLM technologies and how to use them to the team's and company's benefit; a "think outside the box" mentality.
- Experience prompting LLMs in a streamlined way, accounting for how an LLM can "hallucinate" and return wrong information.
- Experience building agentic AI platforms with modular capabilities and autonomous task execution (CrewAI, LangChain, etc.).
- Proficient in implementing Retrieval-Augmented Generation (RAG) pipelines for dynamic knowledge integration (ChromaDB, Pinecone, etc.); a minimal sketch appears below.
- Experience managing a team of AI/LLM experts is a plus; this includes setting goals and objectives for the team and fine-tuning complex models.
- Strong proficiency in Python programming.
- Proficiency in SQL and data querying is a plus.
- Familiarity with web crawling techniques and API integration is a plus, but not a must.
- Experience in AI/ML engineering and data extraction.
- Experience with LLMs and NLP frameworks (spaCy, NLTK, Hugging Face, etc.).
- Strong understanding of machine learning frameworks (TensorFlow, PyTorch).
- Design and build AI models using LLMs.
- Integrate LLM solutions with existing systems via APIs.
- Collaborate with the team to implement and optimize AI solutions.
- Monitor and improve model performance and accuracy.
- Familiarity with Agile development methodologies is a plus.
- Strong problem-solving and analytical skills with attention to detail.
- Creative and critical thinking.
- Ability to work collaboratively in a team environment.
- Good, effective communication skills.
- Experience with version control systems, such as Git, for collaborative development.
- Ability to thrive in a fast-paced environment with rapidly changing priorities.
- Comfortable with autonomy and able to work independently.

Perks:
- Day off on the 3rd Friday of every month (one long weekend each month)
- Monthly Wellness Reimbursement Program to promote health and well-being
- Monthly Office Commutation Reimbursement Program
- Paid paternity and maternity leave

How to apply for this opportunity?
Step 1: Click on Apply and register or log in on our portal.
Step 2: Complete the screening form and upload an updated resume.
Step 3: Increase your chances of getting shortlisted and meeting the client for the interview!

About Uplers:
Our goal is to make hiring reliable, simple, and fast. Our role is to help all our talent find and apply for relevant contractual onsite opportunities and progress in their careers. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities on the portal apart from this one. Depending on the assessments you clear, you can apply for them as well.)
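To make the RAG requirement above concrete, here is a minimal sketch, not part of the posting: it assumes the chromadb and openai Python packages are installed and an OPENAI_API_KEY is set, and the collection name, documents, and model name are illustrative placeholders.

```python
# Minimal RAG sketch: embed a few documents in ChromaDB, retrieve the most
# relevant ones for a question, and pass them to an LLM as grounding context.
# Assumes `pip install chromadb openai` and OPENAI_API_KEY in the environment.
import chromadb
from openai import OpenAI

# 1. Build an in-memory vector store and add a few illustrative documents.
chroma = chromadb.Client()
collection = chroma.create_collection(name="research_notes")  # placeholder name
collection.add(
    ids=["doc1", "doc2"],
    documents=[
        "Our ETL pipeline loads scraped pricing data into the warehouse nightly.",
        "The research team tracks credit card APRs across dozens of issuers.",
    ],
)

# 2. Retrieve the documents most similar to the user's question.
question = "How often is pricing data refreshed?"
hits = collection.query(query_texts=[question], n_results=2)
context = "\n".join(hits["documents"][0])

# 3. Ask the LLM to answer strictly from the retrieved context,
#    which limits the room for hallucinated answers.
llm = OpenAI()
response = llm.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[
        {"role": "system", "content": "Answer only from the provided context."},
        {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {question}"},
    ],
)
print(response.choices[0].message.content)
```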
So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
Posted 4 days ago
0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Introduction
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.

Your Role And Responsibilities
As a Data Engineer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding on the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In This Role, Your Responsibilities May Include:
- Implementing and validating predictive models, and creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques.
- Designing and implementing enterprise search applications, such as Elasticsearch and Splunk, for client requirements (a minimal sketch follows this listing).
- Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours.
- Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results.

Preferred Education
Master's Degree

Required Technical And Professional Expertise
- Proof of Concept (POC) development: develop POCs to validate and showcase the feasibility and effectiveness of the proposed AI solutions.
- Help showcase the ability of a Gen AI code assistant to refactor/rewrite and document code from one language to another.
- Document solution architectures, design decisions, implementation details, and lessons learned.
- Stay up to date with the latest trends and advancements in AI, foundation models, and large language models.
- Evaluate emerging technologies, tools, and frameworks to assess their potential impact on solution design and implementation.

Preferred Technical And Professional Experience
- Experience and working knowledge of COBOL and Java is preferred.
- Experience in code generation, code matching and code translation leveraging LLM capabilities is a big plus.
- A growth mindset toward understanding clients' business processes and challenges.
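As a minimal sketch of the enterprise-search item above (not an IBM deliverable): indexing and querying a couple of documents with the official Elasticsearch Python client, assuming a local cluster reachable at http://localhost:9200; the index name and fields are illustrative only.

```python
# Minimal Elasticsearch sketch: index two documents and run a full-text query.
# Assumes `pip install elasticsearch` and a cluster at localhost:9200.
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")  # assumed local, unsecured cluster

docs = [
    {"title": "Nightly batch load", "body": "Steps for the nightly ETL into the warehouse"},
    {"title": "Real-time ingestion", "body": "Kafka topics feeding the streaming layer"},
]
for i, doc in enumerate(docs):
    es.index(index="team-docs", id=str(i), document=doc)  # illustrative index name

es.indices.refresh(index="team-docs")  # make the documents searchable immediately

# Full-text match on the body field, scored by relevance.
resp = es.search(index="team-docs", query={"match": {"body": "ETL"}})
for hit in resp["hits"]["hits"]:
    print(hit["_score"], hit["_source"]["title"])
```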
Posted 4 days ago
0 years
0 Lacs
Bengaluru East, Karnataka, India
Remote
Req ID: 326519
NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now.

We are currently seeking a Windows Engineering - Systems Engineer to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Responsibilities
- Work with a global team of highly motivated platform engineers and software developers.
- Participate in the full platform product lifecycle for Windows-based solutions: analysis, technical design, testing, release, and support.
- Evaluate and test commercial solutions to help drive "Buy vs. Build" decisions.
- Engineer and maintain automated Windows builds to support desktops on virtualized and cloud environments.
- Execute performance testing of new desktop and server builds on various virtual and physical platforms.

Basic Qualifications
- In-depth knowledge of Windows OS internals, including Group Policies, Windows Defender, the Windows networking stack, and OS build engineering and deployment.
- Experience engineering, managing and supporting environments involving VDI (Citrix, RDS) and Microsoft desktop and server platforms (Windows 10, Windows 11, Server 2016+).
- Experience in automation, anomaly detection and predictive analysis.
- Experience with Windows systems management and patch management technologies (SCCM, MECM, Group Policy and WSUS).
- Extensive experience with one or more scripting technologies: PowerShell, VBS, SQL, Windows Batch (a small patch-inventory sketch follows this listing).
- Experience with Windows build engineering, particularly with SCCM and MDT.
- Experience supporting secure infrastructure solutions involving DMZ environments.

Preferred Qualifications
- Enterprise-level financial services experience is preferred.
- Team player who is able to work in a fast-paced environment.
- Strong verbal and written communication skills.
- Strong analytical and problem-solving skills.
- Experience with public cloud environments (AWS Certified Solutions Architect or equivalent).
- Strong communication skills; able to work with remote counterparts internally and manage external vendors.
- Familiarity with Linux, middleware, networks, storage, email, mobile applications, voice/video/multimedia systems and market data systems.

About NTT DATA
NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com

NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us . This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer.
Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here . If you'd like more information on your EEO rights under the law, please click here . For Pay Transparency information, please click here .
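As a small, hedged illustration of the scripting and patch-management items above (not an NTT DATA tool): a Python helper that shells out to PowerShell's Get-HotFix cmdlet to inventory installed updates on a Windows build; it assumes it runs on a Windows host with PowerShell available on the PATH.

```python
# List installed Windows updates by calling PowerShell's Get-HotFix cmdlet.
# Assumes a Windows host with PowerShell on the PATH.
import json
import subprocess

def installed_hotfixes() -> list[dict]:
    """Return installed hotfixes as dicts with HotFixID and InstalledOn."""
    cmd = [
        "powershell",
        "-NoProfile",
        "-Command",
        "Get-HotFix | Select-Object HotFixID, InstalledOn | ConvertTo-Json",
    ]
    result = subprocess.run(cmd, capture_output=True, text=True, check=True)
    data = json.loads(result.stdout)
    # ConvertTo-Json emits a single object (not a list) when only one item exists.
    return data if isinstance(data, list) else [data]

if __name__ == "__main__":
    for patch in installed_hotfixes():
        print(patch.get("HotFixID"), patch.get("InstalledOn"))
```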
Posted 4 days ago
7.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
We are Lenovo. We do what we say. We own what we do. We WOW our customers.

Lenovo is a US$57 billion revenue global technology powerhouse, ranked #248 in the Fortune Global 500, and serving millions of customers every day in 180 markets. Focused on a bold vision to deliver Smarter Technology for All, Lenovo has built on its success as the world’s largest PC company with a full-stack portfolio of AI-enabled, AI-ready, and AI-optimized devices (PCs, workstations, smartphones, tablets), infrastructure (server, storage, edge, high performance computing and software defined infrastructure), software, solutions, and services. Lenovo’s continued investment in world-changing innovation is building a more equitable, trustworthy, and smarter future for everyone, everywhere. Lenovo is listed on the Hong Kong stock exchange under Lenovo Group Limited (HKSE: 992) (ADR: LNVGY). This transformation, together with Lenovo’s world-changing innovation, is building a more inclusive, trustworthy, and smarter future for everyone, everywhere. To find out more visit www.lenovo.com, and read about the latest news via our StoryHub.

Responsibilities
- Manage assigned accounts via telephone (inbound and outbound), email and web approaches.
- Understand customer requirements and recommend Lenovo product solutions.
- Maintain the customer journey in MS Dynamics.
- Manage the pipeline.
- Take ownership of the customer's end-to-end sales cycle through fulfilment/delivery.
- Achieve set weekly, monthly and quarterly targets.
- Work with cross-functional teams when needed.
- Perform independently when engaging and closing identified sales opportunities within the customer base.
- Manage a team of Inside Sales Specialists.

Education/Qualification: Any postgraduate degree (full time)
Work Experience: 7+ years of experience post full-time graduation; consumer/commercial technology organization experience is desirable.

We are an Equal Opportunity Employer and do not discriminate against any employee or applicant for employment because of race, color, sex, age, religion, sexual orientation, gender identity, national origin, status as a veteran, and basis of disability or any federal, state, or local protected class.
Posted 4 days ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Our Company
We’re Hitachi Vantara, the data foundation trusted by the world’s innovators. Our resilient, high-performance data infrastructure means that customers - from banks to theme parks - can focus on achieving the incredible with data. If you’ve seen the Las Vegas Sphere, you’ve seen just one example of how we empower businesses to automate, optimize, innovate - and wow their customers. Right now, we’re laying the foundation for our next wave of growth. We’re looking for people who love being part of a diverse, global team - and who get excited about making a real-world impact with data.

What You Bring To The Team
Role: Hitachi Storage Engineer
Work Location: Hyderabad
Working hours: Willingness to work in 24x7 and night shifts as required

Job Description
- Extensive knowledge of, and working experience with, Hitachi storage arrays, including HUS, HCI and the VSP G Series, plus knowledge of HNAS and HCP.
- Extensive knowledge of, and working experience with, Brocade switches.
- Knowledge of Hitachi tools such as Tuning Manager, HDCA, Command Suite, HORCM, etc.
- Knowledge of OpsRamp and ServiceNow is preferable.
- Knowledge of managing replication: TrueCopy, HUR and HORCM.
- Administering and using HCS and HDCA.
- Experience in coordination, validation and planning for required firmware/patch/microcode upgrade activities.
- Hands-on experience managing RAID groups, storage pools and storage tiering.
- Knowledge of SAN fabric, Virtual Fabric (VSAN), FCIP and their management, and host-to-SAN connectivity.
- Ownership of the infrastructure, with responsibility for service uptime/availability.
- Maintain and follow SOPs, operational runbooks and ITIL processes: incident management, problem management, change management, performance management, capacity management, and configuration and availability management activities.
- Participate in change and release management.
- Participate in disaster recovery exercises.
- Experience planning and implementing routine and normal changes, mostly:
  - Zoning and LUN sharing
  - LUN allocation and decommissioning
  - Zero page reclamation
  - Splitting and swapping pairs for various DR activities
  - Host group remediation
  - Removing zones
  - HORCM configuration changes
  - Capacity management: LUN expansions and capacity increases (a small sketch follows this listing)
- Knowledge of using a ticketing system (ServiceNow) for daily operations.
- Ability to deal with escalated and high-severity incidents and work with various platform teams.
- Participate in projects independently and deliver on time.

About Us
We’re a global team of innovators. Together, we harness engineering excellence and passion for insight to co-create meaningful solutions to complex challenges. We turn organizations into data-driven leaders that can make a positive impact on their industries and society. If you believe that innovation can inspire the future, this is the place to fulfil your purpose and achieve your potential.

Championing Diversity, Equity, and Inclusion
Diversity, equity, and inclusion (DEI) are integral to our culture and identity. Diverse thinking, a commitment to allyship, and a culture of empowerment help us achieve powerful results. We want you to be you, with all the ideas, lived experience, and fresh perspective that brings. We support your uniqueness and encourage people from all backgrounds to apply and realize their full potential as part of our team.

How We Look After You
We help take care of your today and tomorrow with industry-leading benefits, support, and services that look after your holistic health and wellbeing.
We’re also champions of life balance and offer flexible arrangements that work for you (role and location dependent). We’re always looking for new ways of working that bring out our best, which leads to unexpected ideas. So here, you’ll experience a sense of belonging, and discover autonomy, freedom, and ownership as you work alongside talented people you enjoy sharing knowledge with. We’re proud to say we’re an equal opportunity employer and welcome all applicants for employment without attention to race, colour, religion, sex, sexual orientation, gender identity, national origin, veteran, age, disability status or any other protected characteristic. Should you need reasonable accommodations during the recruitment process, please let us know so that we can do our best to set you up for success.
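To illustrate the capacity-management item above (flagging pools that need a LUN expansion), here is a small, vendor-neutral sketch; the pool figures and the 80% threshold are assumptions for illustration, and real utilisation data would come from the array's management tools rather than a hard-coded list.

```python
# Flag storage pools whose utilisation crosses an expansion threshold.
# Figures and threshold are illustrative; real data would come from array tooling.
from dataclasses import dataclass

@dataclass
class StoragePool:
    name: str
    capacity_tb: float  # total usable capacity in TB
    used_tb: float      # currently allocated capacity in TB

    @property
    def utilisation(self) -> float:
        return self.used_tb / self.capacity_tb

def pools_needing_expansion(pools: list[StoragePool], threshold: float = 0.80) -> list[StoragePool]:
    """Return pools at or above the utilisation threshold."""
    return [p for p in pools if p.utilisation >= threshold]

if __name__ == "__main__":
    pools = [
        StoragePool("POOL-01", capacity_tb=500.0, used_tb=410.0),  # 82% - needs a plan
        StoragePool("POOL-02", capacity_tb=300.0, used_tb=150.0),  # 50% - healthy
    ]
    for pool in pools_needing_expansion(pools):
        print(f"{pool.name}: {pool.utilisation:.0%} used - plan a LUN expansion")
```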
Posted 4 days ago
12.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Req ID: 327365
NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now.

We are currently seeking a Linux/UNIX Lead Engineer to join our team in Hyderabad, Telangana (IN-TG), India (IN).

Preferred Experience:
- Thorough technical knowledge of Linux/UNIX/AIX platforms (Ubuntu, RHEL/SuSE).
- Must be able to demonstrate deep technical troubleshooting and problem-solving skills.
- Expertise in heterogeneous UNIX platform environments, including clustering, backup operations, and patching and activation; technical skills in network components (DNS, DHCP, TCP) are preferred.
- Good fundamentals and hands-on technical experience in Linux/UNIX/AIX troubleshooting.
- Willingness to develop strong competencies in Linux/UNIX platform support.
- Knowledge of shell scripting, Python or Ansible will be a plus (a small health-check sketch follows this listing).
- Excellent understanding of ITSM/ITIL processes; certification is an added advantage.
- Good communication, team-building, and interpersonal skills are mandatory.
- Knowledge and experience of migration methodologies and migration tools such as Double Take and other industry-wide deployment/migration tools.
- Knowledge of Linux VHS V2V - vMotion and Solaris Zones migration from SHS to SHS is preferred.
- Solid understanding of cloud computing, networking, and storage principles, with a focus on AWS; Azure experience is a plus.
- Cloud administration, OS/server administration, patching, maintenance, and troubleshooting experience; willing to work on multiple cloud platforms.
- Shell scripting and automation experience is a plus.
- Fundamental understanding of emerging technologies, including containers and PaaS solutions.
- Able to work on-call rotations.
- Able to work independently in a project scenario.
- Experience handling P1/P2 calls and escalations.
- Able to write quality KB articles, problem management articles, and SOPs/runbooks.
- Passion for delivering timely and outstanding customer service.
- Able to help L1/L2 staff with support.
- Great written and oral communication skills with internal and external customers.

Basic Qualifications:
- 12+ years of overall operational experience
- 5+ years of experience working in a diverse cloud support environment in a 24x7 production support model
- 2+ years of scripting/API experience

Preferred Certifications:
- Red Hat System Administration
- AWS SysOps Administrator / Developer / Solutions Architect Associate certifications
- Azure Administrator Associate / Solutions Architect certifications
- GCP Cloud Engineer

About NTT DATA
NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com

NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users.
If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us . This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here . If you'd like more information on your EEO rights under the law, please click here . For Pay Transparency information, please click here .
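As a minimal sketch of the shell scripting / Python automation item above (not an NTT DATA runbook): a small health-check helper of the kind an SOP might call, assuming a Linux host with the standard uptime command; the mount points are common defaults, not taken from the posting.

```python
# Small Linux health-check helper: uptime plus disk usage on common mount points.
# Assumes a Linux host with the standard `uptime` command available.
import shutil
import subprocess

def disk_report(paths=("/", "/var", "/tmp")) -> dict:
    """Return percentage used for each mount point that exists."""
    report = {}
    for path in paths:
        try:
            usage = shutil.disk_usage(path)
            report[path] = round(100 * usage.used / usage.total, 1)
        except FileNotFoundError:
            report[path] = None  # mount point absent on this host
    return report

def uptime() -> str:
    """Return the host's pretty-printed uptime via `uptime -p`."""
    return subprocess.run(["uptime", "-p"], capture_output=True, text=True).stdout.strip()

if __name__ == "__main__":
    print("Uptime:", uptime())
    for mount, pct in disk_report().items():
        status = "n/a" if pct is None else f"{pct}% used"
        print(f"{mount}: {status}")
```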
Posted 4 days ago
4.0 years
0 Lacs
Greater Lucknow Area
Remote
Posted 4 days ago
4.0 years
0 Lacs
Nashik, Maharashtra, India
Remote
Posted 4 days ago
4.0 years
0 Lacs
Thane, Maharashtra, India
Remote
Posted 4 days ago
4.0 years
0 Lacs
Kanpur, Uttar Pradesh, India
Remote
Posted 4 days ago
4.0 years
0 Lacs
Nagpur, Maharashtra, India
Remote
Posted 4 days ago
3.0 - 6.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Introduction: A Career at HARMAN Automotive
We’re a global, multi-disciplinary team that’s putting the innovative power of technology to work and transforming tomorrow. At HARMAN Automotive, we give you the keys to fast-track your career. Engineer audio systems and integrated technology platforms that augment the driving experience. Combine ingenuity, in-depth research, and a spirit of collaboration with design and engineering excellence. Advance in-vehicle infotainment, safety, efficiency, and enjoyment.
About The Role
We're seeking an experienced Cloud Platform and Data Engineering Specialist with expertise in GCP (Google Cloud Platform) or Azure to join our team. The ideal candidate will have a strong background in cloud computing, data engineering, and DevOps.
What You Will Do
Cloud Platform Management: Manage and optimize cloud infrastructure (GCP), ensuring scalability, security, and performance. Data Engineering: Design and implement data pipelines, data warehousing, and data processing solutions (an illustrative sketch follows this listing). Kubernetes and GKE: Develop and deploy applications using Kubernetes and Google Kubernetes Engine (GKE). Python Development: Develop and maintain scripts and applications using Python.
What You Need To Be Successful
Experience: 3-6 years of experience in cloud computing, data engineering, and DevOps. Technical Skills: Strong understanding of GCP (Google Cloud Platform) or Azure; experience with Kubernetes and GKE; proficiency in the Python programming language (roughly 8/10); basic understanding of data engineering and DevOps practices. Soft Skills: Excellent problem-solving skills and attention to detail; strong communication and collaboration skills.
Bonus Points if You Have
GCP: Experience with GCP services, including Compute Engine, Storage, and BigQuery. Data Engineering: Experience with data engineering tools such as Apache Beam, Dataflow, or BigQuery. DevOps: Experience with DevOps tools such as Jenkins, GitLab CI/CD, or Cloud Build.
What Makes You Eligible
GCP Expertise: Strong expertise in GCP is preferred; Azure experience may be considered as a fallback. Python Proficiency: Proficiency in Python is essential. Kubernetes and GKE: Experience with Kubernetes and GKE is required.
What We Offer
Competitive salary and benefits package. Opportunities for professional growth and development. Collaborative and dynamic work environment. Access to cutting-edge technologies and tools. Recognition and rewards for outstanding performance through BeBrilliant. Chance to work with a renowned German OEM. You are expected to work from the office all five days of the week.
You Belong Here
HARMAN is committed to making every employee feel welcomed, valued, and empowered. No matter what role you play, we encourage you to share your ideas, voice your distinct perspective, and bring your whole self with you – all within a support-minded culture that celebrates what makes each of us unique. We also recognize that learning is a lifelong pursuit and want you to flourish. We proudly offer added opportunities for training, development, and continuing education, further empowering you to live the career you want.
About HARMAN: Where Innovation Unleashes Next-Level Technology
Ever since the 1920s, we’ve been amplifying the sense of sound. Today, that legacy endures, with integrated technology platforms that make the world smarter, safer, and more connected. 
Across automotive, lifestyle, and digital transformation solutions, we create innovative technologies that turn ordinary moments into extraordinary experiences. Our renowned automotive and lifestyle solutions can be found everywhere, from the music we play in our cars and homes to venues that feature today’s most sought-after performers, while our digital transformation solutions serve humanity by addressing the world’s ever-evolving needs and demands. Marketing our award-winning portfolio under 16 iconic brands, such as JBL, Mark Levinson, and Revel, we set ourselves apart by exceeding the highest engineering and design standards for our customers, our partners and each other. If you’re ready to innovate and do work that makes a lasting impact, join our talent community today!
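For context on the kind of GCP data-engineering work the role above describes, here is a minimal, illustrative sketch in Python using the BigQuery client library. The project, dataset, and table names are placeholders invented for the example, not details from the posting.

```python
# Illustrative only: a minimal GCP data-pipeline step using the BigQuery Python client.
# The project, dataset, and table names below are placeholders, not part of the job posting.
from google.cloud import bigquery  # pip install google-cloud-bigquery

def load_daily_events(project_id: str = "my-project") -> None:
    client = bigquery.Client(project=project_id)

    # Run a simple aggregation in BigQuery (assumed example table).
    query = """
        SELECT event_date, COUNT(*) AS event_count
        FROM `my-project.analytics.events`
        GROUP BY event_date
        ORDER BY event_date
    """
    rows = client.query(query).result()

    # Hand the results to a downstream step; here we just print them.
    for row in rows:
        print(row["event_date"], row["event_count"])

if __name__ == "__main__":
    load_daily_events()
```

On GKE, a job like this would typically run inside a container scheduled by Kubernetes, with credentials supplied through the cluster rather than hard-coded keys.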
Posted 4 days ago
5.0 - 12.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Experience Range: 5-12 years. Location: Mumbai/Pune/Bangalore/Hyderabad/Chennai. Job Description: We are primarily looking for a data engineer with expertise in building and processing data pipelines using Databricks Spark SQL on Hadoop distributions such as AWS EMR, Databricks, Cloudera, etc. The candidate should be very proficient in large-scale data operations using Databricks and very comfortable with Python, with familiarity with AWS compute, storage, and IAM concepts and experience using an S3 data lake as the storage tier. Any ETL background (Talend, AWS Glue, etc.) is a plus but not required; cloud warehouse experience (Snowflake, etc.) is a huge plus. The ideal candidate carefully evaluates alternative risks and solutions before taking action, optimizes the use of all available resources, and develops solutions to meet business needs that reflect a clear understanding of the objectives, practices, and procedures of the corporation, department, and business unit.
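As a rough illustration of the pipeline work this listing describes, the following is a minimal Spark SQL sketch that reads Parquet from an S3 data lake and writes an aggregated result back. The bucket path, view name, and columns are placeholders; on Databricks a `spark` session is normally provided, so the explicit builder is only needed when running outside that environment.

```python
# Illustrative only: a minimal Spark SQL transformation of the kind described above.
# The S3 paths and column names are placeholders invented for the example.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("s3-data-lake-example").getOrCreate()

# Read raw data from an S3 data lake (assumed Parquet layout).
orders = spark.read.parquet("s3a://example-bucket/raw/orders/")
orders.createOrReplaceTempView("orders")

# Aggregate with Spark SQL and write the curated result back to the lake.
daily_totals = spark.sql("""
    SELECT order_date, SUM(amount) AS total_amount
    FROM orders
    GROUP BY order_date
""")
daily_totals.write.mode("overwrite").parquet("s3a://example-bucket/curated/daily_totals/")
```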
Posted 4 days ago
0 years
0 Lacs
Kochi, Kerala, India
On-site
Company Description
MYT4 Systems, an industrial technical software solution provider headquartered in Kochi, Kerala, offers comprehensive solutions for design, product development, digital mock-up, realistic simulations, and virtual product experiences through state-of-the-art software. Our portfolio includes industry-leading tools such as 3DExperience, SolidWorks, Hexagon and Grabert, Adobe, Microsoft, etc. We also provide a wide range of hardware and security solutions and services, including servers, laptops, desktops, printers, plotters, scanners, firewalls, endpoint security and antivirus, operating systems, backup, and storage solutions, along with cutting-edge solutions for 3D printing, 3D scanning, and virtual reality.
Role Description
This is a full-time on-site role as a Senior Sales Engineer located in Kochi. Drive sales for software solutions such as SolidWorks, Hexagon, Coohom, Adobe, Microsoft, etc. Identify and qualify leads in target industries. Prepare and deliver proposals and quotes. Work closely with pre-sales teams. Achieve and exceed quarterly sales targets. Build long-term relationships with customers.
Qualifications
Sales engineering and sales skills. Good communication skills. Strong problem-solving abilities. Experience in the CAD/CAE industry is a plus. Bachelor's degree in engineering or a related field.
Posted 4 days ago
6.0 - 9.5 years
0 Lacs
Andhra Pradesh, India
On-site
At PwC, our people in software and product innovation focus on developing cutting-edge software solutions and driving product innovation to meet the evolving needs of clients. These individuals combine technical experience with creative thinking to deliver innovative software products and solutions. Those in software engineering at PwC will focus on developing innovative software solutions to drive digital transformation and enhance business performance. In this field, you will use your knowledge to design, code, and test cutting-edge applications that revolutionise industries and deliver exceptional user experiences.
Position Title: Full Stack Lead Developer
Experience: 6-9.5 years
Job Overview
We are seeking a highly skilled and versatile polyglot Full Stack Developer with expertise in modern front-end and back-end technologies, cloud-based solutions, AI/ML, and Gen AI. The ideal candidate will have a strong foundation in full-stack development, cloud platforms (preferably Azure), and hands-on experience with Gen AI, AI, and machine learning technologies.
Key Responsibilities
Develop and maintain web applications using Angular/React.js, .NET, and Python. Design, deploy, and optimize Azure-native PaaS and SaaS services, including but not limited to Function Apps, Service Bus, Storage Accounts, SQL Databases, Key Vault, ADF, Databricks, and REST APIs with OpenAPI specifications. Implement security best practices for data in transit and at rest, along with authentication best practices (SSO, OAuth 2.0, and Auth0). Use Python to develop data processing and advanced AI/ML models with libraries such as pandas, NumPy, scikit-learn, LangChain, LlamaIndex, and the Azure OpenAI SDK. Leverage agentic frameworks such as CrewAI and AutoGen; be well versed in RAG and agentic architecture. Be strong in design patterns: architectural, data, and object-oriented. Leverage Azure serverless components to build highly scalable and efficient solutions. Create, integrate, and manage workflows using Power Platform, including Power Automate, Power Pages, and SharePoint. Apply expertise in machine learning, deep learning, and Generative AI to solve complex problems.
Primary Skills
Proficiency in React.js, .NET, and Python. Strong knowledge of Azure Cloud Services, including serverless architectures and data security. Experience with Python data analytics libraries: pandas, NumPy, scikit-learn, Matplotlib, Seaborn. Experience with Python Generative AI frameworks: LangChain, LlamaIndex, CrewAI, AutoGen. Familiarity with REST API design, Swagger documentation, and authentication best practices.
Secondary Skills
Experience with Power Platform tools such as Power Automate, Power Pages, and SharePoint integration. Knowledge of Power BI for data visualization (preferred).
Preferred Knowledge Areas – Nice To Have
In-depth understanding of machine learning and deep learning, including supervised and unsupervised algorithms.
Qualifications
Bachelor's or master's degree in Computer Science, Engineering, or a related field. 6-12 years of hands-on experience in full-stack development and cloud-based solutions. Strong problem-solving skills and ability to design scalable, maintainable solutions. Excellent communication and collaboration skills.
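Since the role above calls out RAG pipelines alongside LangChain/LlamaIndex and the Azure OpenAI SDK, here is a framework-agnostic sketch of the retrieve-augment-generate pattern. The `embed` and `generate` functions are stand-ins for a real embedding model and LLM call, and the sample documents are invented for illustration only.

```python
# Illustrative only: a framework-agnostic sketch of the RAG pattern.
# `embed` and `generate` are placeholders for a real embedding model and LLM
# (e.g. wired up via LangChain/LlamaIndex or the Azure OpenAI SDK).
import numpy as np

def embed(text: str) -> np.ndarray:
    # Placeholder: a deterministic pseudo-embedding so the sketch runs end to end.
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    vec = rng.normal(size=128)
    return vec / np.linalg.norm(vec)

def generate(prompt: str) -> str:
    # Placeholder for an LLM completion call.
    return f"[LLM answer based on a prompt of {len(prompt)} characters]"

documents = [
    "Azure Function Apps run event-driven serverless code.",
    "Service Bus provides reliable message queuing between services.",
    "Power Automate builds low-code workflows across Microsoft 365.",
]
doc_vectors = np.stack([embed(d) for d in documents])

def answer(question: str, top_k: int = 2) -> str:
    # Retrieve: rank documents by cosine similarity to the question embedding.
    q = embed(question)
    scores = doc_vectors @ q
    top_docs = [documents[i] for i in np.argsort(scores)[::-1][:top_k]]
    # Augment and generate: feed the retrieved context to the model.
    prompt = "Context:\n" + "\n".join(top_docs) + f"\n\nQuestion: {question}\nAnswer:"
    return generate(prompt)

print(answer("How do I queue messages between services?"))
```

In practice the in-memory vector array would be replaced by a vector store and the placeholders by actual model clients; the control flow stays the same.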
Posted 4 days ago
4.0 - 9.0 years
15 - 22 Lacs
Noida, Pune, Bengaluru
Work from Office
Job Description: The candidate must have 4 to 9 years of experience in Perl programming. Strong knowledge of Perl programming, object-oriented Perl, the Perl debugger, regular expressions, file handling, DBI, XML, REST APIs, etc. Experience in the storage and virtualization domain (VMware) is good to have. Good understanding of TCP/IP, networks, and operating system concepts. Knowledge of Linux and databases is desirable, as is knowledge of DevOps concepts such as Docker, containers, Terraform, and Jenkins. Knowledge of Python is a plus. Awareness of test automation and test frameworks. Experience in the storage domain and knowledge of virtualization and VMware concepts. Good analytical skills and problem-solving ability. Self-driven, with a positive attitude toward learning. Good leadership qualities: the ability to drive work within a team and be a good team player.
Posted 4 days ago
4.0 years
0 Lacs
Indore, Madhya Pradesh, India
Remote
Experience : 4.00 + years Salary : Confidential (based on experience) Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Remote Placement Type : Full time Permanent Position (*Note: This is a requirement for one of Uplers' client - Forbes Advisor) What do you need for this opportunity? Must have skills required: TensorFlow, PyTorch, rag, LangChain Forbes Advisor is Looking for: Location - Remote (For candidate's from Chennai or Mumbai it's hybrid) Forbes Advisor is a new initiative for consumers under the Forbes Marketplace umbrella that provides journalist- and expert-written insights, news and reviews on all things personal finance, health, business, and everyday life decisions. We do this by providing consumers with the knowledge and research they need to make informed decisions they can feel confident in, so they can get back to doing the things they care about most. At Marketplace, our mission is to help readers turn their aspirations into reality. We arm people with trusted advice and guidance, so they can make informed decisions they feel confident in and get back to doing the things they care about most. We are an experienced team of industry experts dedicated to helping readers make smart decisions and choose the right products with ease. Marketplace boasts decades of experience across dozens of geographies and teams, including Content, SEO, Business Intelligence, Finance, HR, Marketing, Production, Technology and Sales. The team brings rich industry knowledge to Marketplace’s global coverage of consumer credit, debt, health, home improvement, banking, investing, credit cards, small business, education, insurance, loans, real estate and travel. The Data Extraction Team is a brand-new team who plays a crucial role in our organization by designing, implementing, and overseeing advanced web scraping frameworks. Their core function involves creating and refining tools and methodologies to efficiently gather precise and meaningful data from a diverse range of digital platforms. Additionally, this team is tasked with constructing robust data pipelines and implementing Extract, Transform, Load (ETL) processes. These processes are essential for seamlessly transferring the harvested data into our data storage systems, ensuring its ready availability for analysis and utilization. A typical day in the life of a Data Research Engineer will involve coming up with ideas regarding how the company/team can best harness the power of AI/LLM, and use it not only simplify operations within the team, but also to streamline the work of the research team in gathering/retrieving large sets of data. The role is that of a leader who sets a vision for the future of AI/LLM’s use within the team and the company. They think outside the box and are proactive in engaging with new technologies and developing new ideas for the team to move forward in the AI/LLM field. The candidate should also at least be willing to acquire some basic skills in scraping and data pipelining. Responsibilities: Develop methods to leverage the potential of LLM and AI within the team. Proactive at finding new solutions to engage the team with AI/LLM, and streamline processes in the team. Be a visionary with AI/LLM tools and predict how the use of future technologies could be harnessed early on so that when these technologies come out, the team is ahead of the game regarding how it could be used. Assist in acquiring and integrating data from various sources, including web crawling and API integration. 
Stay updated with emerging technologies and industry trends. Explore third-party technologies as alternatives to legacy approaches for efficient data pipelines. Contribute to cross-functional teams in understanding data requirements. Assume accountability for achieving development milestones. Prioritize tasks to ensure timely delivery, in a fast-paced environment with rapidly changing priorities. Collaborate with and assist fellow members of the Data Research Engineering Team as required. Leverage online resources effectively like StackOverflow, ChatGPT, Bard, etc., while considering their capabilities and limitations. Skills And Experience Bachelor's degree in Computer Science, Data Science, or a related field. Higher qualifications is a plus. Think proactively and creatively regarding the next AI/LLM technologies and how to use them to the team’s and company’s benefits. “Think outside the box” mentality. Experience prompting LLMs in a streamlined way, taking into account how the LLM can potentially “hallucinate” and return wrong information. Experience building agentic AI platforms with modular capabilities and autonomous task execution. (crewai, lagchain, etc.) Proficient in implementing Retrieval-Augmented Generation (RAG) pipelines for dynamic knowledge integration. (chromadb, pinecone, etc) Experience managing a team of AI/LLM experts is a plus: this includes setting up goals and objectives for the team and fine-tuning complex models. Strong proficiency in Python programming Proficiency in SQL and data querying is a plus. Familiarity with web crawling techniques and API integration is a plus but not a must. Experience in AI/ML engineering and data extraction Experience with LLMs, NLP frameworks (spaCy, NLTK, Hugging Face, etc.) Strong understanding of machine learning frameworks (TensorFlow, PyTorch) Design and build AI models using LLMs Integrate LLM solutions with existing systems via APIs Collaborate with the team to implement and optimize AI solutions Monitor and improve model performance and accuracy Familiarity with Agile development methodologies is a plus. Strong problem-solving and analytical skills with attention to detail. Creative and critical thinking. Ability to work collaboratively in a team environment. Good and effective communication skills. Experience with version control systems, such as Git, for collaborative development. Ability to thrive in a fast-paced environment with rapidly changing priorities. Comfortable with autonomy and ability to work independently. Perks: Day off on the 3rd Friday of every month (one long weekend each month) Monthly Wellness Reimbursement Program to promote health well-being Monthly Office Commutation Reimbursement Program Paid paternity and maternity leaves How to apply for this opportunity? Step 1: Click On Apply! And Register or Login on our portal. Step 2: Complete the Screening Form & Upload updated Resume Step 3: Increase your chances to get shortlisted & meet the client for the Interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well). 
So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
Posted 4 days ago
4.0 years
0 Lacs
Surat, Gujarat, India
Remote
Experience : 4.00 + years Salary : Confidential (based on experience) Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Remote Placement Type : Full time Permanent Position (*Note: This is a requirement for one of Uplers' client - Forbes Advisor) What do you need for this opportunity? Must have skills required: TensorFlow, PyTorch, rag, LangChain Forbes Advisor is Looking for: Location - Remote (For candidate's from Chennai or Mumbai it's hybrid) Forbes Advisor is a new initiative for consumers under the Forbes Marketplace umbrella that provides journalist- and expert-written insights, news and reviews on all things personal finance, health, business, and everyday life decisions. We do this by providing consumers with the knowledge and research they need to make informed decisions they can feel confident in, so they can get back to doing the things they care about most. At Marketplace, our mission is to help readers turn their aspirations into reality. We arm people with trusted advice and guidance, so they can make informed decisions they feel confident in and get back to doing the things they care about most. We are an experienced team of industry experts dedicated to helping readers make smart decisions and choose the right products with ease. Marketplace boasts decades of experience across dozens of geographies and teams, including Content, SEO, Business Intelligence, Finance, HR, Marketing, Production, Technology and Sales. The team brings rich industry knowledge to Marketplace’s global coverage of consumer credit, debt, health, home improvement, banking, investing, credit cards, small business, education, insurance, loans, real estate and travel. The Data Extraction Team is a brand-new team who plays a crucial role in our organization by designing, implementing, and overseeing advanced web scraping frameworks. Their core function involves creating and refining tools and methodologies to efficiently gather precise and meaningful data from a diverse range of digital platforms. Additionally, this team is tasked with constructing robust data pipelines and implementing Extract, Transform, Load (ETL) processes. These processes are essential for seamlessly transferring the harvested data into our data storage systems, ensuring its ready availability for analysis and utilization. A typical day in the life of a Data Research Engineer will involve coming up with ideas regarding how the company/team can best harness the power of AI/LLM, and use it not only simplify operations within the team, but also to streamline the work of the research team in gathering/retrieving large sets of data. The role is that of a leader who sets a vision for the future of AI/LLM’s use within the team and the company. They think outside the box and are proactive in engaging with new technologies and developing new ideas for the team to move forward in the AI/LLM field. The candidate should also at least be willing to acquire some basic skills in scraping and data pipelining. Responsibilities: Develop methods to leverage the potential of LLM and AI within the team. Proactive at finding new solutions to engage the team with AI/LLM, and streamline processes in the team. Be a visionary with AI/LLM tools and predict how the use of future technologies could be harnessed early on so that when these technologies come out, the team is ahead of the game regarding how it could be used. Assist in acquiring and integrating data from various sources, including web crawling and API integration. 
Stay updated with emerging technologies and industry trends. Explore third-party technologies as alternatives to legacy approaches for efficient data pipelines. Contribute to cross-functional teams in understanding data requirements. Assume accountability for achieving development milestones. Prioritize tasks to ensure timely delivery, in a fast-paced environment with rapidly changing priorities. Collaborate with and assist fellow members of the Data Research Engineering Team as required. Leverage online resources effectively like StackOverflow, ChatGPT, Bard, etc., while considering their capabilities and limitations. Skills And Experience Bachelor's degree in Computer Science, Data Science, or a related field. Higher qualifications is a plus. Think proactively and creatively regarding the next AI/LLM technologies and how to use them to the team’s and company’s benefits. “Think outside the box” mentality. Experience prompting LLMs in a streamlined way, taking into account how the LLM can potentially “hallucinate” and return wrong information. Experience building agentic AI platforms with modular capabilities and autonomous task execution. (crewai, lagchain, etc.) Proficient in implementing Retrieval-Augmented Generation (RAG) pipelines for dynamic knowledge integration. (chromadb, pinecone, etc) Experience managing a team of AI/LLM experts is a plus: this includes setting up goals and objectives for the team and fine-tuning complex models. Strong proficiency in Python programming Proficiency in SQL and data querying is a plus. Familiarity with web crawling techniques and API integration is a plus but not a must. Experience in AI/ML engineering and data extraction Experience with LLMs, NLP frameworks (spaCy, NLTK, Hugging Face, etc.) Strong understanding of machine learning frameworks (TensorFlow, PyTorch) Design and build AI models using LLMs Integrate LLM solutions with existing systems via APIs Collaborate with the team to implement and optimize AI solutions Monitor and improve model performance and accuracy Familiarity with Agile development methodologies is a plus. Strong problem-solving and analytical skills with attention to detail. Creative and critical thinking. Ability to work collaboratively in a team environment. Good and effective communication skills. Experience with version control systems, such as Git, for collaborative development. Ability to thrive in a fast-paced environment with rapidly changing priorities. Comfortable with autonomy and ability to work independently. Perks: Day off on the 3rd Friday of every month (one long weekend each month) Monthly Wellness Reimbursement Program to promote health well-being Monthly Office Commutation Reimbursement Program Paid paternity and maternity leaves How to apply for this opportunity? Step 1: Click On Apply! And Register or Login on our portal. Step 2: Complete the Screening Form & Upload updated Resume Step 3: Increase your chances to get shortlisted & meet the client for the Interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well). 
So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
Posted 4 days ago
4.0 years
0 Lacs
Chandigarh, India
Remote
Experience : 4.00 + years Salary : Confidential (based on experience) Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Remote Placement Type : Full time Permanent Position (*Note: This is a requirement for one of Uplers' client - Forbes Advisor) What do you need for this opportunity? Must have skills required: TensorFlow, PyTorch, rag, LangChain Forbes Advisor is Looking for: Location - Remote (For candidate's from Chennai or Mumbai it's hybrid) Forbes Advisor is a new initiative for consumers under the Forbes Marketplace umbrella that provides journalist- and expert-written insights, news and reviews on all things personal finance, health, business, and everyday life decisions. We do this by providing consumers with the knowledge and research they need to make informed decisions they can feel confident in, so they can get back to doing the things they care about most. At Marketplace, our mission is to help readers turn their aspirations into reality. We arm people with trusted advice and guidance, so they can make informed decisions they feel confident in and get back to doing the things they care about most. We are an experienced team of industry experts dedicated to helping readers make smart decisions and choose the right products with ease. Marketplace boasts decades of experience across dozens of geographies and teams, including Content, SEO, Business Intelligence, Finance, HR, Marketing, Production, Technology and Sales. The team brings rich industry knowledge to Marketplace’s global coverage of consumer credit, debt, health, home improvement, banking, investing, credit cards, small business, education, insurance, loans, real estate and travel. The Data Extraction Team is a brand-new team who plays a crucial role in our organization by designing, implementing, and overseeing advanced web scraping frameworks. Their core function involves creating and refining tools and methodologies to efficiently gather precise and meaningful data from a diverse range of digital platforms. Additionally, this team is tasked with constructing robust data pipelines and implementing Extract, Transform, Load (ETL) processes. These processes are essential for seamlessly transferring the harvested data into our data storage systems, ensuring its ready availability for analysis and utilization. A typical day in the life of a Data Research Engineer will involve coming up with ideas regarding how the company/team can best harness the power of AI/LLM, and use it not only simplify operations within the team, but also to streamline the work of the research team in gathering/retrieving large sets of data. The role is that of a leader who sets a vision for the future of AI/LLM’s use within the team and the company. They think outside the box and are proactive in engaging with new technologies and developing new ideas for the team to move forward in the AI/LLM field. The candidate should also at least be willing to acquire some basic skills in scraping and data pipelining. Responsibilities: Develop methods to leverage the potential of LLM and AI within the team. Proactive at finding new solutions to engage the team with AI/LLM, and streamline processes in the team. Be a visionary with AI/LLM tools and predict how the use of future technologies could be harnessed early on so that when these technologies come out, the team is ahead of the game regarding how it could be used. Assist in acquiring and integrating data from various sources, including web crawling and API integration. 
Stay updated with emerging technologies and industry trends. Explore third-party technologies as alternatives to legacy approaches for efficient data pipelines. Contribute to cross-functional teams in understanding data requirements. Assume accountability for achieving development milestones. Prioritize tasks to ensure timely delivery, in a fast-paced environment with rapidly changing priorities. Collaborate with and assist fellow members of the Data Research Engineering Team as required. Leverage online resources effectively like StackOverflow, ChatGPT, Bard, etc., while considering their capabilities and limitations. Skills And Experience Bachelor's degree in Computer Science, Data Science, or a related field. Higher qualifications is a plus. Think proactively and creatively regarding the next AI/LLM technologies and how to use them to the team’s and company’s benefits. “Think outside the box” mentality. Experience prompting LLMs in a streamlined way, taking into account how the LLM can potentially “hallucinate” and return wrong information. Experience building agentic AI platforms with modular capabilities and autonomous task execution. (crewai, lagchain, etc.) Proficient in implementing Retrieval-Augmented Generation (RAG) pipelines for dynamic knowledge integration. (chromadb, pinecone, etc) Experience managing a team of AI/LLM experts is a plus: this includes setting up goals and objectives for the team and fine-tuning complex models. Strong proficiency in Python programming Proficiency in SQL and data querying is a plus. Familiarity with web crawling techniques and API integration is a plus but not a must. Experience in AI/ML engineering and data extraction Experience with LLMs, NLP frameworks (spaCy, NLTK, Hugging Face, etc.) Strong understanding of machine learning frameworks (TensorFlow, PyTorch) Design and build AI models using LLMs Integrate LLM solutions with existing systems via APIs Collaborate with the team to implement and optimize AI solutions Monitor and improve model performance and accuracy Familiarity with Agile development methodologies is a plus. Strong problem-solving and analytical skills with attention to detail. Creative and critical thinking. Ability to work collaboratively in a team environment. Good and effective communication skills. Experience with version control systems, such as Git, for collaborative development. Ability to thrive in a fast-paced environment with rapidly changing priorities. Comfortable with autonomy and ability to work independently. Perks: Day off on the 3rd Friday of every month (one long weekend each month) Monthly Wellness Reimbursement Program to promote health well-being Monthly Office Commutation Reimbursement Program Paid paternity and maternity leaves How to apply for this opportunity? Step 1: Click On Apply! And Register or Login on our portal. Step 2: Complete the Screening Form & Upload updated Resume Step 3: Increase your chances to get shortlisted & meet the client for the Interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well). 
So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
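Likewise, a minimal extract-transform-load sketch for the crawling and API-integration work described in the responsibilities above. The URL, CSS selector and table name are illustrative placeholders rather than real endpoints; it assumes the widely used requests, BeautifulSoup and sqlite3 libraries.

```python
# Minimal ETL sketch: crawl a page, normalize the scraped values,
# and load them into a local SQLite table.
import sqlite3

import requests
from bs4 import BeautifulSoup

def extract(url: str) -> list[str]:
    # Fetch a page and pull the text of each listing title.
    html = requests.get(url, timeout=30).text
    soup = BeautifulSoup(html, "html.parser")
    return [tag.get_text(strip=True) for tag in soup.select("h2.listing-title")]

def transform(titles: list[str]) -> list[tuple[str]]:
    # Normalize whitespace and drop empty entries before loading.
    return [(t,) for t in (title.strip() for title in titles) if t]

def load(rows: list[tuple[str]], db_path: str = "listings.db") -> None:
    # Write the cleaned rows into a simple staging table.
    with sqlite3.connect(db_path) as conn:
        conn.execute("CREATE TABLE IF NOT EXISTS listings (title TEXT)")
        conn.executemany("INSERT INTO listings (title) VALUES (?)", rows)

if __name__ == "__main__":
    load(transform(extract("https://example.com/listings")))
```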
Posted 4 days ago
4.0 years
0 Lacs
Dehradun, Uttarakhand, India
Remote
Posted 4 days ago
4.0 years
0 Lacs
Mysore, Karnataka, India
Remote
Posted 4 days ago
4.0 years
0 Lacs
Vijayawada, Andhra Pradesh, India
Remote
Posted 4 days ago
The storage job market in India is seeing significant growth as more and more companies are investing in data storage solutions to manage their increasing volumes of data. From cloud storage to data centers, there is a high demand for skilled professionals in this field. If you are looking to explore storage jobs in India, here is a guide to help you navigate the job market.
The leading hiring locations are cities known for their strong IT infrastructure and a high concentration of companies that need storage professionals.
The average salary range for storage professionals in India varies based on experience level. Entry-level positions can expect to earn between INR 3-6 lakhs per annum, while experienced professionals can earn upwards of INR 10 lakhs per annum.
A typical career progression in the storage field may include roles such as Storage Administrator, Storage Engineer, Storage Architect, and eventually Storage Manager. With each role, professionals gain more experience and responsibility in designing and managing storage solutions.
In addition to storage expertise, professionals in this field are often expected to have skills in networking, virtualization, and data security. Knowledge of cloud storage solutions and storage technologies like SAN and NAS can also be beneficial.
As you prepare for storage job interviews in India, remember to showcase your expertise in storage technologies and related skills. With the right preparation and confidence, you can land a rewarding career in this dynamic field. Good luck!