3.0 years
0 Lacs
Navi Mumbai, Maharashtra, India
On-site
Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: Business Agility
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years of full-time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs. Additionally, you will monitor and optimize data workflows to enhance performance and reliability, ensuring that data is accessible and actionable for stakeholders.

Roles & Responsibilities:
- Databricks experience with the Azure cloud is required.
- Expected to perform independently and become an SME.
- Actively participate in and contribute to team discussions.
- Contribute to solving work-related problems.
- Collaborate with data architects and analysts to design scalable data solutions.
- Implement best practices for data governance and security throughout the data lifecycle.

Professional & Technical Skills:
- Must-have skills: Proficiency in the Databricks Unified Data Analytics Platform.
- Good-to-have skills: Experience with Business Agility.
- Strong understanding of data modeling and database design principles.
- Experience with data integration tools and ETL processes.
- Familiarity with cloud platforms and services related to data storage and processing.
Additional Information:
- The candidate should have a minimum of 3 years of experience in the Databricks Unified Data Analytics Platform.
- This position is based at our Pune office.
- 15 years of full-time education is required.
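The posting describes ETL pipelines that extract, transform, and load data across systems. As a rough, hedged illustration of that pattern only (plain Python with an in-memory SQLite target standing in for Databricks/PySpark; every name and rule below is invented for the sketch):

```python
import sqlite3

# Hypothetical mini-ETL: the role itself would use Databricks/PySpark, but the
# extract -> transform -> load shape is the same. All names here are invented.

def extract(rows):
    """Extract: yield raw source records (an in-memory stand-in for a source system)."""
    yield from rows

def transform(record):
    """Transform: normalize fields and drop records that fail a quality check."""
    name, amount = record
    if amount is None:          # basic data-quality rule
        return None
    return (name.strip().lower(), round(float(amount), 2))

def load(conn, records):
    """Load: write cleaned records to the target table."""
    conn.execute("CREATE TABLE IF NOT EXISTS sales (name TEXT, amount REAL)")
    conn.executemany("INSERT INTO sales VALUES (?, ?)", records)
    conn.commit()

source = [(" Alice ", "10.5"), ("Bob", None), ("Carol", "7")]
conn = sqlite3.connect(":memory:")
cleaned = [r for r in map(transform, extract(source)) if r is not None]
load(conn, cleaned)
print(conn.execute("SELECT COUNT(*) FROM sales").fetchone()[0])  # 2 rows survive the quality rule
```

The same three stages map onto a real pipeline: the extract step becomes a source connector, the transform step a Spark job, and the load step a write to a Delta table.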
Posted 1 day ago
9.0 years
0 Lacs
India
On-site
We’re looking for a hands-on AI/ML engineer to design, build, and deploy Generative AI and LLM solutions at scale.

What You’ll Do:
- Design and implement ML and GenAI models to solve real-world problems.
- Build RAG pipelines using enterprise data and LLMs (OpenAI, HuggingFace, etc.).
- Integrate LLM-based chatbots with enterprise systems and domain data.
- Deploy solutions using Azure ML, Azure OpenAI, Azure Functions, etc.
- Fine-tune LLMs for domain-specific tasks and performance.
- Collaborate with DS, MLOps, and engineering teams for CI/CD and operationalization.
- Stay current with GenAI innovations and apply them to platform evolution.

Skills We’re Looking For:
- 4–9 years of experience in AI/ML/Data Science.
- Strong Python with PyTorch, TensorFlow, HuggingFace, LangChain.
- Cloud: Azure (preferred); AWS or GCP also considered.
- Experience with Azure ML, Azure Cognitive Services, Azure OpenAI.
- Familiarity with RAG architectures and vector DBs (e.g., Pinecone, Weaviate, Azure AI Search).
- Bonus: Databricks, Spark, Docker/K8s, prompt engineering, chatbot frameworks.
- Domain experience: BFSI (banking, fintech, insurance) preferred.

Working model: Hybrid
Work Locations: Bengaluru, Chennai, Hyderabad, Pune, Coimbatore, Gurugram, Mumbai

Ready to build the future with GenAI? Apply now!
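At its core, a RAG pipeline of the kind mentioned above retrieves the documents most similar to a query and passes them to an LLM as context. A toy sketch of just the retrieval step (bag-of-words cosine similarity standing in for a real embedding model and vector DB such as Pinecone or Azure AI Search; all data below is made up):

```python
import math
from collections import Counter

# Toy retrieval step of a RAG pipeline. A real system would embed text with a
# model (OpenAI, HuggingFace, ...) and query a vector DB; term-frequency
# vectors and cosine similarity stand in for both here.

def embed(text):
    """'Embed' a text as a bag-of-words term-frequency vector."""
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, docs, k=1):
    """Return the top-k documents most similar to the query."""
    q = embed(query)
    return sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]

docs = [
    "loan approval policy for retail banking customers",
    "quarterly sales report for the insurance division",
    "employee cafeteria menu for this week",
]
context = retrieve("policy for approving a customer loan", docs)
prompt = f"Answer using this context:\n{context[0]}"  # this is what would go to the LLM
print(context[0])
```

Swapping `embed` for a real embedding call and `docs` for a vector-index lookup turns this into the standard retrieve-then-generate loop.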
Posted 1 day ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Summary: The Digital Business Analyst will play a key role in the creation and implementation of a new Business Intelligence (BI) system that will enhance the visibility of sales, forecasts, and market-share details across various departments, including Finance, Marketing, Sales, and other areas. This role involves working closely with business stakeholders to understand their needs, developing and configuring BI solutions, and ensuring the effective use of data for decision-making.

Key Responsibilities
- Collaborate with cross-functional teams and business stakeholders to gather, analyze, and challenge business requirements for BI solutions.
- Work with IT and Data Analyst teams to develop, configure, and model BI solutions.
- Ensure the BI system is scalable and capable of handling a large number of users effectively.
- Conduct data validation and quality assurance to ensure the accuracy and reliability of BI outputs.
- Continuously monitor and optimize the BI system for performance and usability.
- Participate in the maintenance and evolution of the BI solution in support of the Product Owner.
- Coordinate with Data Owners, Business Data Stewards, and other stakeholders to align the BI product with the company's data strategy.
- Provide training and support to end-users to maximize utilization of the BI system and effective data-driven decision-making.
- Assist and animate the BI Champions and support requests for the development or enhancement of BI solutions.
- Provide end-to-end data and business expertise to ensure high-quality processes and outcomes.
- Occasionally handle data visualization tasks related to value creation.
- Supervise and guide junior business analysts as needed.
Key Achievements Expected
- Ensure the build, deployment, and evolution of a new BI system according to the project plan: successfully design, implement, and maintain a comprehensive BI solution that delivers key business indicators linked to sales, forecasts, market share, and other critical metrics. Assist the Product Owner in planning and prioritizing developments and ensuring the availability of new data and functionalities.
- Develop/design BI reports and dashboards: create detailed reports and dashboards to track important metrics and provide valuable insights that answer business needs. Write the required user stories.
- Collaborate with cross-functional teams and business stakeholders: work closely with various departments to gather and analyze business requirements for BI solutions.
- User training: train users on data utilization and visualization to ensure effective data-driven decision-making.
- Ensure scalability and user management: make sure the BI system is scalable and capable of handling a large user base effectively.
- Conduct data validation and quality assurance: ensure the accuracy and reliability of BI outputs through rigorous data validation and quality assurance processes.
- Optimize system performance: continuously monitor and optimize the BI system for performance and usability, and ensure data availability and functionality within the required timelines and quality standards.
- Ensure the lifecycle management of BI solutions, including collecting evolution needs, improvements, and problem resolution.
- Support BI Champions: assist BI Champions and support requests for the development or enhancement of BI solutions.

Technologies
- Power BI
- Power Apps
- Power Automate
- SharePoint
- SQL, Azure Databricks (PySpark): good to have
Posted 1 day ago
10.0 years
0 Lacs
Bhilai, Chhattisgarh, India
On-site
Job Description
We are seeking a highly skilled and experienced Senior ETL Engineer with strong expertise in Azure, Azure Data Factory, and Apache Spark. The ideal candidate will play a key role in designing, developing, and optimizing scalable ETL pipelines and data integration workflows that power enterprise-level data platforms.

Roles and Responsibilities
- Design, develop, and maintain robust ETL processes using Azure Data Factory, Apache Spark, and other relevant tools.
- Work closely with data architects and business analysts to understand data requirements and translate them into technical solutions.
- Optimize data transformation workflows for performance and scalability.
- Build and support data pipelines for ingesting, processing, and delivering structured and unstructured data from various sources.
- Implement best practices for data governance, security, and monitoring across Azure services.
- Ensure the quality and integrity of data through validations and error handling.
- Collaborate across teams to support data warehousing and analytics initiatives.

Qualifications and Skills
- 7–10 years of experience in ETL development and data integration.
- Strong hands-on experience with Azure Data Factory and other Azure data services.
- Deep understanding of Apache Spark and distributed data processing.
- Experience with SQL and performance tuning.
- Familiarity with data modeling concepts, data lakes, and data warehousing.
- Solid understanding of CI/CD pipelines and version control tools.
- Knowledge of Azure-based security, storage, and networking is a plus.
- Excellent problem-solving skills and ability to work in a fast-paced environment.

Preferred Qualifications
- Microsoft Certified: Azure Data Engineer Associate
- Experience working in Agile/Scrum methodologies
- Exposure to tools like Databricks, Synapse Analytics, or Snowflake is a plus

Experience: 8-10 years
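The posting asks for data quality enforced "through validations and error handling". One common shape for that is a validate-or-quarantine step inside the pipeline: clean rows flow on, failing rows are captured with a reason for later inspection and replay. A minimal, hedged sketch in plain Python (a real pipeline would express this in ADF data flows or Spark; the rules and field names are invented):

```python
# Hypothetical validate-or-quarantine step. Good rows continue downstream,
# bad rows are stored with the reason they failed. Rules are invented.

RULES = {
    "id":     lambda v: isinstance(v, int) and v > 0,
    "amount": lambda v: isinstance(v, (int, float)) and v >= 0,
}

def validate(row):
    """Return the name of the first failing field, or None if the row is clean."""
    for field, rule in RULES.items():
        if field not in row or not rule(row[field]):
            return field
    return None

def split_batch(rows):
    """Partition a batch into clean rows and quarantined rows with reasons."""
    good, quarantined = [], []
    for row in rows:
        bad_field = validate(row)
        if bad_field is None:
            good.append(row)
        else:
            quarantined.append({"row": row, "reason": f"bad {bad_field}"})
    return good, quarantined

batch = [{"id": 1, "amount": 9.5}, {"id": -2, "amount": 3.0}, {"id": 3}]
good, quarantined = split_batch(batch)
print(len(good), len(quarantined))  # 1 clean row, 2 quarantined rows
```

The quarantine list is what makes the error handling operational: it gives monitoring something to alert on and engineers something concrete to fix and replay.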
Posted 1 day ago
0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
General Skills & Experience: Minimum 10–18 years of experience
- Expertise in Spark (Scala/Python), Kafka, and cloud-native big data services (GCP, AWS, Azure) for ETL, batch, and stream processing.
- Deep knowledge of cloud platforms (AWS, Azure, GCP), including certification (preferred).
- Experience designing and managing advanced data warehousing and lakehouse architectures (e.g., Snowflake, Databricks, Delta Lake, BigQuery, Redshift, Synapse).
- Proven experience building, managing, and optimizing ETL/ELT pipelines and data workflows for large-scale systems.
- Strong experience with data lakes, storage formats (Parquet, ORC, Delta, Iceberg), and data movement strategies (cloud and hybrid).
- Advanced knowledge of data modeling, SQL development, data partitioning, optimization, and database administration.
- Solid understanding of and experience with Master Data Management (MDM) solutions and reference data frameworks.
- Proficient in implementing Data Lineage, Data Cataloging, and Data Governance solutions (e.g., AWS Glue Data Catalog, Azure Purview).
- Familiar with data privacy, data security, compliance regulations (GDPR, CCPA, HIPAA, etc.), and best practices for enterprise data protection.
- Experience with data integration tools and technologies (e.g., AWS Glue, GCP Dataflow, Apache NiFi/Airflow, etc.).
- Expertise in batch and real-time data processing architectures; familiarity with event-driven, microservices, and message-driven patterns.
- Hands-on experience with data analytics and BI/visualization tools (Power BI, Tableau, Looker, Qlik, etc.) and supporting complex reporting use cases.
- Demonstrated capability with data modernization projects: migrations from legacy/on-prem systems to cloud-native architectures.
- Experience with data quality frameworks, monitoring, and observability (data validation, metrics, lineage, health checks).
- Background in working with structured, semi-structured, unstructured, temporal, and time series data at large scale.
- Familiarity with Data Science and ML pipeline integration (DevOps/MLOps, model monitoring, and deployment practices).
- Experience defining and managing enterprise metadata strategies.
Posted 1 day ago
0 years
0 Lacs
India
On-site
About NTT DATA
NTT DATA is a $30+ billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. We invest over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure, and connectivity. We are also one of the leading providers of digital and AI infrastructure in the world. NTT DATA is part of NTT Group and headquartered in Tokyo. Visit us at nttdata.com

Job Description for Data Engineer
Experience with the following key deployment automations is required:
- 5+ years of experience in Azure Data Factory (ADF)
- Azure SQL
- PySpark
- Delta Lake (Databricks)
- Databricks deployment pipelines
- Automation of ADF deployments
- Automation of database deployments (Azure SQL, Delta Lake)
- Automation of Databricks deployments
- Deployment of Python, Django, and React-based microservices to Azure services such as Function Apps, Container Apps, Web Apps, and Azure Kubernetes Service (AKS)

Job Mode: Hybrid (2 days a week)
Job Location: Any NTT DATA office
Note: Preferred only candidates who can join within a 30-day time frame.
#NTTData #LI-NorthAmerica
Posted 1 day ago
6.0 - 8.0 years
0 Lacs
Greater Chennai Area
On-site
About BNP Paribas India Solutions Established in 2005, BNP Paribas India Solutions is a wholly owned subsidiary of BNP Paribas SA, European Union’s leading bank with an international reach. With delivery centers located in Bengaluru, Chennai and Mumbai, we are a 24x7 global delivery center. India Solutions services three business lines: Corporate and Institutional Banking, Investment Solutions and Retail Banking for BNP Paribas across the Group. Driving innovation and growth, we are harnessing the potential of over 10000 employees, to provide support and develop best-in-class solutions. About BNP Paribas Group BNP Paribas is the European Union’s leading bank and key player in international banking. It operates in 65 countries and has nearly 185,000 employees, including more than 145,000 in Europe. The Group has key positions in its three main fields of activity: Commercial, Personal Banking & Services for the Group’s commercial & personal banking and several specialised businesses including BNP Paribas Personal Finance and Arval; Investment & Protection Services for savings, investment, and protection solutions; and Corporate & Institutional Banking, focused on corporate and institutional clients. Based on its strong diversified and integrated model, the Group helps all its clients (individuals, community associations, entrepreneurs, SMEs, corporates and institutional clients) to realize their projects through solutions spanning financing, investment, savings and protection insurance. In Europe, BNP Paribas has four domestic markets: Belgium, France, Italy, and Luxembourg. The Group is rolling out its integrated commercial & personal banking model across several Mediterranean countries, Turkey, and Eastern Europe. As a key player in international banking, the Group has leading platforms and business lines in Europe, a strong presence in the Americas as well as a solid and fast-growing business in Asia-Pacific. 
BNP Paribas has implemented a Corporate Social Responsibility approach in all its activities, enabling it to contribute to the construction of a sustainable future, while ensuring the Group's performance and stability.

Commitment to Diversity and Inclusion
At BNP Paribas, we passionately embrace diversity and are committed to fostering an inclusive workplace where all employees are valued, respected and can bring their authentic selves to work. We prohibit discrimination and harassment of any kind and our policies promote equal employment opportunity for all employees and applicants, irrespective of, but not limited to, their gender, gender identity, sex, sexual orientation, ethnicity, race, colour, national origin, age, religion, social status, mental or physical disabilities, veteran status, etc. As a global bank, we truly believe that inclusion and diversity of our teams is key to our success in serving our clients and the communities we operate in.

About Business Line/Function
The Intermediate Holding Company (“IHC”) program structured at the U.S. level across poles of activities of BNP Paribas provides guidance, supports the analysis and impact assessment, and drives adjustments of the U.S. platform’s operating model due to the drastic changes introduced by the Enhanced Prudential Standards (“EPS”) for Foreign Banking Organizations (“FBOs”) finalized by the Federal Reserve in February 2014, implementing Section 165 of the U.S. Dodd-Frank Act. The IT Transversal Team is part of the Information Technology Group, which works simultaneously on a wide range of projects arising from business, strategic initiatives, regulatory changes, and reengineering of existing applications to improve functionality and efficiency.
Job Title: Python Developer
Date: June 2025
Department: ITG - Fresh
Location: Chennai, Mumbai
Business Line / Function: Finance Dedicated Solutions
Number of Direct Reports: NA
Directorship / Registration: NA

Position Purpose
The Python Developer will play a critical role in building and maintaining financial applications and tools that support data processing, analysis, and reporting within a fast-paced financial services environment. This position involves developing scalable and secure systems. The developer will collaborate with business analysts and finance users/finance BAs to translate complex business requirements into efficient, high-quality software solutions. A strong understanding of financial concepts, data integrity, and regulatory compliance is essential. The detailed responsibilities are listed below.

Responsibilities
Direct Responsibilities
- Proficient in object-oriented programming, especially Python, with a minimum of 6-8 years of core Python development experience.
- Strong competency with Python libraries such as Pandas and NumPy for data wrangling, analysis, and manipulation.
- Expertise in PySpark for large-scale data processing and loading into databases.
- Proficiency in data querying and manipulation with Oracle and PostgreSQL.
- Strong communication skills to effectively collaborate with team members and stakeholders.
- Familiarity with the Software Development Life Cycle (SDLC) process and its various stages, including experience with JIRA and Confluence.

Technical & Behavioral Competencies
In addition to the technical skills above:
- Good analytical, problem-solving, and communication skills.
- Engage in technical discussions and help improve the system, processes, etc.

Nice to Have
- Familiarity with Plotly and Matplotlib for data visualization of large datasets.
- Skilled in API programming, handling JSON, CSV, and other unstructured data from various systems.
- Familiarity with JavaScript, CSS, and HTML.
- Experience with cloud architecture applications such as Dataiku or Databricks; competency with ETL tools.
- Knowledge of regulatory frameworks, RISK, CCAR, and GDPR.

Skills Referential
Behavioural Skills:
- Ability to collaborate / Teamwork
- Critical thinking
- Ability to deliver / Results driven
- Communication skills - oral & written

Transversal Skills:
- Analytical ability
- Ability to develop and adapt a process
- Ability to understand, explain and support change
- Ability to develop others & improve their skills

Education Level: Bachelor's degree or equivalent
Experience Level: At least 5 years
Posted 1 day ago
100.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Our client is a global technology company headquartered in Santa Clara, California. It focuses on helping organizations harness the power of data to drive digital transformation, enhance operational efficiency, and achieve sustainability. It combines over 100 years of experience in operational technology (OT) and more than 60 years in IT to unlock the power of data from your business, your people and your machines. We help enterprises store, enrich, activate and monetise their data to improve their customers’ experiences, develop new revenue streams and lower their business costs. Over 80% of the Fortune 100 trust our client for data solutions. The company’s consolidated revenues for fiscal 2024 (ended March 31, 2024) were approximately $57.5 billion USD, and the company has approximately 296,000 employees worldwide. It delivers digital solutions utilizing Lumada in five sectors (Mobility, Smart Life, Industry, Energy and IT) to increase customers’ social, environmental and economic value.

Job Title: Quantexa Certified Engineer
Location: PAN India
Experience: 9 to 12 years
Job Type: Contract to hire
Notice Period: Immediate joiners
Mandatory Skills: Quantexa certification, Scala, Spark, Azure Databricks

Job Summary:
Role Purpose
The purpose of the Data Engineer role is to design, build, and unit test data pipelines and jobs for projects and programmes on the Azure platform. This role is for the Quantexa fraud platform programme; a Quantexa certified engineer is preferred.
- Analyse business requirements, and support and maintain the Quantexa platform.
- Build and deploy new/changed data mappings, sessions, and workflows on the Azure cloud platform; the key focus area is the Quantexa platform on Azure.
- Develop both batch (using Azure Databricks) and real-time (Kafka and Kubernetes) pipelines and jobs to extract, transform and load data to the platform.
- Perform ETL routine performance tuning, troubleshooting, support, and capacity estimation.
- Conduct thorough testing of ETL code changes to ensure quality deliverables.
- Provide day-to-day support and mentoring to end users who are interacting with the data.
- Profile and understand large amounts of available source data, including structured and semi-structured/web activity data.
- Analyse defects and provide fixes.
- Provide release notes for deployments.
- Support release activities.
- Problem-solving attitude.
- Keep up to date with new skills; develop technology skills in other areas of the platform.
- Exposure to fraud, financial crime, customer insights, or compliance-based projects that utilize detection and prediction models.
- Experienced in ETL tools like Databricks (Spark) and data projects.
- Experience with Kubernetes to deliver real-time data ingestion and transformation using Scala. Scala knowledge is highly desirable; Python knowledge is a plus.
- Strong knowledge of SQL.
- Strong analytical skills.
- Azure DevOps knowledge.
- Experience with a local IDE, design documentation, and unit testing.
Posted 1 day ago
0 years
0 Lacs
India
On-site
Job Title : Quality Assurance Automation Engineer Job Type : Full-time, Contractor Location : Hybrid - Hyderabad | Pune| Delhi About Us: Our mission at micro1 is to match the most talented people in the world with their dream jobs. If you are looking to be at the forefront of AI innovation and work with some of the fastest-growing companies in Silicon Valley, we invite you to apply for a role. By joining the micro1 community, your resume will become visible to top industry leaders, unlocking access to the best career opportunities on the market. Job Summary: We are seeking a detail-oriented and innovative Quality Assurance Automation Engineer to join our customer's team. In this critical role, you will design, develop, and execute automated tests to ensure the quality, reliability, and integrity of data within Databricks environments. If you are passionate about data quality, thrive in collaborative environments, and excel at both written and verbal communication, we'd love to meet you. Key Responsibilities: Design, develop, and maintain robust automated test scripts using Python, Selenium, and SQL to validate data integrity within Databricks environments. Execute comprehensive data validation and verification activities to ensure accuracy and consistency across multiple systems, data warehouses, and data lakes. Create detailed and effective test plans and test cases based on technical requirements and business specifications. Integrate automated tests with CI/CD pipelines to facilitate seamless and efficient testing and deployment processes. Work collaboratively with data engineers, developers, and other stakeholders to gather data requirements and achieve comprehensive test coverage. Document test cases, results, and identified defects; communicate findings clearly to the team. Conduct performance testing to ensure data processing and retrieval meet established benchmarks. 
Provide mentorship and guidance to junior team members, promoting best practices in test automation and data validation. Required Skills and Qualifications: Strong proficiency in Python, Selenium, and SQL for developing test automation solutions. Hands-on experience with Databricks, data warehouse, and data lake architectures. Proven expertise in automated testing of data pipelines, preferably with tools such as Apache Airflow, dbt Test, or similar. Proficient in integrating automated tests within CI/CD pipelines on cloud platforms (AWS, Azure preferred). Excellent written and verbal communication skills with the ability to translate technical concepts to diverse audiences. Bachelor’s degree in Computer Science, Information Technology, or a related discipline. Demonstrated problem-solving skills and a collaborative approach to teamwork. Preferred Qualifications: Experience with implementing security and data protection measures in data-driven applications. Ability to integrate user-facing elements with server-side logic for seamless data experiences. Demonstrated passion for continuous improvement in test automation processes, tools, and methodologies.
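Data-validation automation of the kind this posting describes often boils down to reconciliation assertions between a source and a target: row counts match, keys are unique, aggregates agree. A small, hedged sketch (plain Python lists standing in for query results from Databricks or a warehouse; in a real suite each check would be a pytest test function and the data would come from SQL):

```python
# Hypothetical reconciliation checks between a 'source' system and a 'target'
# warehouse table. The row data is invented for illustration.

source = [(1, "alice", 10.0), (2, "bob", 20.0), (3, "carol", 30.0)]
target = [(1, "alice", 10.0), (2, "bob", 20.0), (3, "carol", 30.0)]

def check_row_counts(src, tgt):
    """Every source row should land in the target, and nothing extra."""
    return len(src) == len(tgt)

def check_unique_keys(rows, key_index=0):
    """Primary keys in the target must not be duplicated by the load."""
    keys = [r[key_index] for r in rows]
    return len(keys) == len(set(keys))

def check_totals(src, tgt, amount_index=2, tolerance=1e-9):
    """Aggregates should agree within a tolerance (guards against float drift)."""
    return abs(sum(r[amount_index] for r in src) -
               sum(r[amount_index] for r in tgt)) <= tolerance

results = {
    "row_counts": check_row_counts(source, target),
    "unique_keys": check_unique_keys(target),
    "totals": check_totals(source, target),
}
print(results)  # all checks pass when source and target agree
```

Wiring checks like these into a CI/CD pipeline is what turns one-off data spot checks into the repeatable, automated validation the role calls for.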
Posted 1 day ago
0.0 - 5.0 years
0 Lacs
Bengaluru, Karnataka
On-site
The Schneider AI Hub team is a global organization led by the Chief AI Officer. The AI Hub organization has a mandate to spearhead the AI-driven digital transformation of Schneider Electric. We’re looking for a top-notch Software Engineer who always sweats the small stuff and cares about impeccable code. If you always strive towards sustainable, scalable and secure code for every application you develop, while keeping the customer's needs in mind, then you are what we are looking for. You’ll get the chance to work with experienced engineers across our enterprise, with a chance to move across varying automation technologies in the future. As a Python developer in the AI team, you will have the opportunity to work on multiple complex assignments such as physics-based models, lambda functions, automation pipelines, object-oriented programming, etc.

Responsibilities
- Analyze and translate business requirements into scalable and resilient designs.
- Own parts of the application and continuously improve them in an agile environment.
- Create high-quality, maintainable products and applications using best engineering practices.
- Pair with other developers and share design philosophy and goals across the team.
- Work in cross-functional teams (DevOps, Data, UX, Testing, etc.).
- Build and manage fully automated build/test/deployment environments.
- Ensure high availability and provide quick turnaround on production issues.
- Contribute to the design of useful, usable, and desirable products in a team environment.
- Adapt to new programming languages, methodologies, platforms, and frameworks to support business needs.

Qualifications, Requirements and Skills
You should be an engineering graduate, preferably from a computer science background or with strong computer science fundamentals.
- 2 to 5 years of experience required.
- Knowledge of and experience with the following technologies:
  Languages: Python, Matlab, R, object-oriented languages (Java, C++, C#, JavaScript)
  Cloud platforms: Microsoft Azure (mandatory)
  Tools: Kubernetes (AKS), Databricks, Docker, Spark, OpenDataSoft, Git
- Proficiency in OO design is critical.
- Experience with computer science fundamentals in data structures, algorithms, and complexity analysis.
- Good analytical skills.
- Strong verbal & written communication: should be able to articulate concisely & clearly.

About Us
Schneider Electric™ creates connected technologies that reshape industries, transform cities and enrich lives. Our 144,000 employees thrive in more than 100 countries. From the simplest of switches to complex operational systems, our technology, software and services improve the way our customers manage and automate their operations. Great people make Schneider Electric a great company. We seek out and reward people for putting the customer first, being disruptive to the status quo, embracing different perspectives, continuously learning, and acting like owners. We want our employees to reflect the diversity of the communities in which we operate. We welcome people as they are, creating an inclusive culture where all forms of diversity are seen as a real value for the company. We’re looking for people with a passion for success, on the job and beyond.

Primary Location: IN-Karnataka-Bangalore
Schedule: Full-time
Unposting Date: Ongoing
Posted 1 day ago
0.0 years
0 Lacs
Hyderabad, Telangana
On-site
BI Specialist II
Hyderabad, India; Ahmedabad, India | Information Technology | 312656

Job Description
About The Role: Grade Level (for internal use): 09

The Team: Are you ready to dive into the world of data and uncover insights that shape global commodity markets? We're looking for a passionate BI Developer to join our Business Intelligence team within the Commodity Insights division at S&P Global. At S&P Global, we are on a mission to harness the power of data to unlock insights that propel our business forward. We believe in innovation, collaboration, and the relentless pursuit of excellence. Join our dynamic team and be a part of a culture that celebrates creativity and encourages you to push the boundaries of what’s possible.

Key Responsibilities
Unlocking the Power of Data
- Collaborate on the end-to-end data journey, helping collect, cleanse, and transform diverse data sources into actionable insights that shape business strategies for functional leaders.
- Work alongside senior BI professionals to build powerful ETL processes, ensuring data quality, consistency, and accessibility.

Crafting Visual Storytelling
- Develop eye-catching, impactful dashboards and reports that tell the story of commodity trends, prices, and global market dynamics.
- Bring data to life for stakeholders across the company, including executive teams, analysts, and developers, by helping to create visually compelling and interactive reporting tools.
- Mentor and train users on dashboard usage for efficient utilization of insights.

Becoming a Data Detective
- Dive deep into commodities data to uncover trends, patterns, and hidden insights that influence critical decisions in real time.
- Demonstrate strong analytical skills to swiftly grasp business needs and translate them into actionable insights.
- Collaborate with stakeholders to define key metrics and KPIs, and contribute to data-driven decisions that impact the organization’s direction.
Engaging with Strategic Minds
- Work together with cross-functional teams within business operations to turn complex business challenges into innovative data solutions.
- Gather, refine, and translate business requirements into insightful reports and dashboards that push our BI team to new heights.
- Provide ongoing support to cross-functional teams, addressing issues and adapting to changing business processes.

Basic Qualifications:
- 3+ years of professional experience in BI projects, focusing on dashboard development using Power BI or similar tools and deploying them on their respective online platforms for easy access.
- Proficiency in working with various databases such as Redshift, Oracle, and Databricks, using SQL for data manipulation, and implementing ETL processes for BI dashboards.
- Ability to identify meaningful patterns and trends in data to provide valuable insights for business decision-making.
- Skilled in requirement gathering and developing BI solutions.
- Candidates with a strong background/proficiency in Power BI and Power Platform tools such as Power Automate/Apps, and intermediate to advanced proficiency in Python, are preferred.
- Essential understanding of data modeling techniques tailored to problem statements.
- Familiarity with cloud platforms (e.g., Azure, AWS) and data warehousing.
- Exposure to GenAI concepts and tools such as ChatGPT.
- Experience with Agile project implementation methods.
- Excellent written and verbal communication skills.
- Must be able to self-start and succeed in a fast-paced environment.

Additional/Preferred Qualifications:
- Knowledge of Generative AI, Microsoft Copilot, and Microsoft Fabric is a plus.
- Ability to write complex SQL queries or enhance the performance of existing ETL pipelines is a must.
- Familiarity with Azure DevOps is an added advantage.
Shift Timings:- 1PM-10PM IST (Flexibility Required) About S&P Global Commodity Insights At S&P Global Commodity Insights, our complete view of global energy and commodities markets enables our customers to make decisions with conviction and create long-term, sustainable value. We’re a trusted connector that brings together thought leaders, market participants, governments, and regulators to co-create solutions that lead to progress. Vital to navigating Energy Transition, S&P Global Commodity Insights’ coverage includes oil and gas, power, chemicals, metals, agriculture and shipping. S&P Global Commodity Insights is a division of S&P Global (NYSE: SPGI). S&P Global is the world’s foremost provider of credit ratings, benchmarks, analytics and workflow solutions in the global capital, commodity and automotive markets. With every one of our offerings, we help many of the world’s leading organizations navigate the economic landscape so they can plan for tomorrow, today. For more information, visit http://www.spglobal.com/commodity-insights. What’s In It For You? Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology–the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress. Our People: We're more than 35,000 strong worldwide—so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. 
From finding new ways to measure sustainability, to analyzing energy transition across the supply chain, to building workflow solutions that make it easy to tap into insight and apply it, we are changing the way people see things and empowering them to make an impact on the world we live in. We’re committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We’re constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference. Our Values: Integrity, Discovery, Partnership At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals. Benefits: We take care of you, so you can take care of business. We care about our people. That’s why we provide everything you—and your career—need to thrive at S&P Global. Our benefits include: Health & Wellness: Health care coverage designed for the mind and body. Flexible Downtime: Generous time off helps keep you energized for your time on. Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills. Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly Perks: It’s not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families. Beyond the Basics: From retail discounts to referral incentive awards—small perks can make a big difference.
For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets. Recruitment Fraud Alert: If you receive an email from a spglobalind.com domain or any other regionally based domains, it is a scam and should be reported to reportfraud@spglobal.com. S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, “pre-employment training” or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here. Equal Opportunity Employer: S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law.
Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf - 20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group), SWP Priority – Ratings - (Strategic Workforce Planning) Job ID: 312656 Posted On: 2025-06-26 Location: Hyderabad, Telangana, India
Posted 1 day ago
3.0 years
0 Lacs
Hyderabad, Telangana
Remote
Software Engineer II Hyderabad, Telangana, India Date posted Jun 26, 2025 Job number 1837285 Work site Up to 50% work from home Travel 25-50% Role type Individual Contributor Profession Software Engineering Discipline Software Engineering Employment type Full-Time Overview Are you passionate about working on cutting-edge devices? The Surface Team is dedicated to building powerful devices that empower individuals and organizations. We’re working on the next generation of Surface products, and we need talented individuals like you! We are seeking a Full Stack Software Engineer II (AI) to architect and scale the Surface team’s internal tools and systems. The ideal candidate is organized, adaptable, and capable of managing multiple responsibilities across system design, development, and maintenance. This role involves leading the design of secure and scalable systems architecture, overseeing services development, implementing best practices to enhance functionality, and driving high-priority development efforts. Additionally, the engineer will play a key role in integrating AI solutions and automation workflows to boost team productivity, all within a secure and compliant infrastructure. In alignment with our Microsoft values, we are committed to cultivating an inclusive work environment for all employees to positively impact our culture every day. Microsoft’s mission is to empower every person and every organization on the planet to achieve more. As employees we come together with a growth mindset, innovate to empower others, and collaborate to realize our shared goals. Each day we build on our values of respect, integrity, and accountability to create a culture of inclusion where everyone can thrive at work and beyond.
Qualifications Required Qualifications: Bachelor's Degree in Computer Science or related technical field AND 4+ years technical engineering experience with coding in languages including, but not limited to, C, C++, C#, Java, JavaScript, or Python OR equivalent experience. Proficiency in one or more programming languages: C#, Java, JavaScript, TypeScript, Rust, Python. Experience with modern web technologies like .NET, Node, React, Angular, building RESTful APIs, and familiarity with web concepts such as HTTP and MVC. 3+ years of experience with cloud platforms like Azure Synapse and Azure ML Studio, focusing on AI model development and deployment. 1+ years of experience with Agentic AI & Language Learning. Familiarity with AI tools and creating/utilizing AI agents to enhance team productivity. Strong collaboration experience in designing, testing, and shipping solutions to large technical problems. Ability to design and develop expandable, componentized software meeting requirements on time. Capability to handle ambiguity, understand key business needs, and apply appropriate technology solutions. Excellent analytical, problem-solving, debugging skills, and solid understanding of object-oriented design, coding patterns, and testing practices. Experience in agile, DevOps, microservices, and mobile is a plus. Other Requirements: The ability to meet Microsoft, customer and/or government security screening requirements is required for this role. These requirements include but are not limited to the following specialized security screenings: Microsoft Cloud Background Check: This position will be required to pass the Microsoft Cloud background check upon hire/transfer and every two years thereafter. Preferred Qualifications: Excellent written and verbal communication skills. Experience in developing Monitoring & Telemetry tools. Experience with building dashboards, code analysis, and secure practices.
#W+Djobs Responsibilities Collaborate with Engineers, Product Managers, Technical Program Managers, Designers, and Partners to deliver features designed with the appropriate architecture. Demonstrate sound business acumen by working with business partners to understand data, define problems, and solve primary business objectives. Understand data pipelines, process flows, and reports. Acquire the necessary data for successful project completion. Proactively identify changes and communicate them to senior leadership. Utilize artificial intelligence to enhance productivity. Leverage expertise in machine learning solutions (e.g., classification, regression, clustering, forecasting, natural language processing, image recognition) and algorithms to determine the best approach for setting quotas. Effectively communicate with diverse audiences on data quality issues and initiatives. Develop a thorough understanding of Microsoft's AI and ML toolset (e.g., Azure Machine Learning, Azure Cognitive Services, Azure Databricks). Collaborate with end customers and internal cross-functional stakeholders at Microsoft to understand business needs and create a project roadmap that leads to measurable improvements in business performance metrics. Maintain a customer-oriented focus by understanding and validating customer perspectives and focusing on the broader organizational context. Promote and ensure customer adoption by delivering model solutions and supporting relationships. Assume on-call duties as scheduled. Embody our culture and values. Benefits/perks listed below may vary depending on the nature of your employment with Microsoft and the country where you work. Industry leading healthcare Educational resources Discounts on products and services Savings and investments Maternity and paternity leave Generous time away Giving programs Opportunities to network and connect Microsoft is an equal opportunity employer.
All qualified applicants will receive consideration for employment without regard to age, ancestry, citizenship, color, family or medical care leave, gender identity or expression, genetic information, immigration status, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran or military status, race, ethnicity, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable local laws, regulations and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application process, read more about requesting accommodations.
Posted 1 day ago
11.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Position: AI/ML Lead (Associate Director Level) Experience: 11+ years Mode: Hybrid Salary: Up to 75 LPA
We are looking to hire an AI/ML Lead for one of our well-established MNC clients.
- Proficiency in pulling structured and unstructured data from legacy systems to create a data architecture that can drive AI/ML efforts.
- Ability to engage actively with internal stakeholders, helping them understand the intricacies of AI/ML solutions and the factors involved in successful execution.
- Ability to leverage the latest AI offerings (ChatGPT, GPT-4, Copilots, Tesseract OCR, OCR with CNN) from Microsoft, AWS, IBM, and other providers to deliver solutions specific to the business interests.
- Deep hands-on understanding of OCR and BI technologies.
- Proven track record in thought leadership, particularly in IT modernization and innovation, to expedite immigration benefits and solve business problems.
- Solution Architect mindset, adept in planning data usage for delivering outcomes through multiple applications and continuously improving AI & ML models.
Essential Qualifications:
- 11 years in IT with a focus on enterprise data architecture, ML, and BI.
- 5+ years leading data teams around enterprise-scale AI/ML or RPA efforts.
- 3+ years hands-on experience leading teams on the use of deep learning frameworks (e.g., TensorFlow or PyTorch).
- Experience with prompt engineering for conversational AI.
- Proficiency in managing big data and distributed computing environments (e.g., Databricks, Spark, Dask).
- Proven experience in leveraging LLMs for analytics and decision making.
- Highly proficient in AWS or Azure data processing (e.g., EC2, S3, Redshift).
- Bachelor's degree in Data Science, Statistics, Computer Science, IT Management, Engineering, or similar.
Bonus Qualifications:
- Expertise in Advanced AI Domains: Proficiency in areas such as Natural Language Processing (NLP), OCR with CNN, and AGI, showcasing a broad and deep understanding of AI technologies.
Client Engagement and Collaboration: Excellent skills in engaging with clients, coupled with a strong ability to collaborate on new ideas and concepts, indicating both interpersonal and creative capabilities. Innovation and Team Leadership: Experience in creating and implementing innovative roadmaps and aligning teams for efficient delivery, reflecting leadership and strategic planning abilities. Data Process Management: Demonstrated expertise in managing and moving data through various processes, ensuring accountability for outcomes, emphasizing operational and execution skills. Work Ethic and Team Spirit: An independent worker who embodies a strong work ethic and team spirit, ensuring a balance between autonomy and collaboration.
Posted 2 days ago
3.0 years
0 Lacs
Greater Kolkata Area
On-site
Project Role : Application Developer Project Role Description : Design, build and configure applications to meet business process and application requirements. Must have skills : Databricks Unified Data Analytics Platform Good to have skills : NA Minimum 3 Year(s) Of Experience Is Required Educational Qualification : 15 years full time education
Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. Your typical day will involve collaborating with team members to develop innovative solutions and ensure seamless application functionality.
Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Develop and implement software solutions to meet business requirements.
- Collaborate with cross-functional teams to analyze and address technical issues.
- Conduct code reviews and provide feedback to enhance application performance.
- Stay updated on emerging technologies and trends in application development.
- Assist in troubleshooting and resolving application-related issues.
Professional & Technical Skills:
- Must To Have Skills: Proficiency in Databricks Unified Data Analytics Platform.
- Strong understanding of data analytics and data processing techniques.
- Experience with cloud-based data platforms like AWS or Azure.
- Knowledge of programming languages such as Python, Java, or Scala.
- Hands-on experience in developing and deploying applications using the Databricks platform.
Additional Information:
- The candidate should have a minimum of 3 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Kolkata office.
- A 15 years full-time education is required.
Posted 2 days ago
3.0 years
0 Lacs
Bhubaneswar, Odisha, India
On-site
Project Role : Data Engineer Project Role Description : Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems. Must have skills : Databricks Unified Data Analytics Platform Good to have skills : Business Agility Minimum 3 Year(s) Of Experience Is Required Educational Qualification : 15 years full time education
Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs. Additionally, you will monitor and optimize data workflows to enhance performance and reliability, ensuring that data is accessible and actionable for stakeholders.
Roles & Responsibilities:
- Need a Databricks resource with Azure cloud experience.
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Collaborate with data architects and analysts to design scalable data solutions.
- Implement best practices for data governance and security throughout the data lifecycle.
Professional & Technical Skills:
- Must To Have Skills: Proficiency in Databricks Unified Data Analytics Platform.
- Good To Have Skills: Experience with Business Agility.
- Strong understanding of data modeling and database design principles.
- Experience with data integration tools and ETL processes.
- Familiarity with cloud platforms and services related to data storage and processing.
Additional Information:
- The candidate should have a minimum of 3 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Pune office.
- A 15 years full-time education is required.
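The extract, transform, and load stages this Data Engineer role names can be sketched generically. The following is a minimal, hypothetical pipeline in plain stdlib Python (standing in for what would be PySpark/Databricks code in practice), with a simple data-quality gate in the transform step; the CSV payload, field names, and in-memory sink are illustrative only:

```python
import csv
import io

# Hypothetical raw extract; a Databricks pipeline would read this from cloud storage.
RAW = "id,amount\n1,10\n2,oops\n3,30\n"

def extract(text):
    """Extract: parse raw CSV into a list of dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: coerce types and drop invalid records (a basic data-quality gate)."""
    clean = []
    for r in rows:
        try:
            clean.append({"id": int(r["id"]), "amount": float(r["amount"])})
        except ValueError:
            continue  # record fails validation; a real pipeline would quarantine it
    return clean

def load(rows, sink):
    """Load: append validated rows to the target store (here, just a list)."""
    sink.extend(rows)
    return sink

sink = []
load(transform(extract(RAW)), sink)
print(sink)  # the row with amount 'oops' is dropped by the quality gate
```

In a real Databricks job the same three stages would typically map onto a cloud read, DataFrame transformations with explicit schema and quality checks, and a write to a Delta table.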
Posted 2 days ago
0 years
0 Lacs
India
On-site
Shape the Future of AI At Labelbox, we're building the critical infrastructure that powers breakthrough AI models at leading research labs and enterprises. Since 2018, we've been pioneering data-centric approaches that are fundamental to AI development, and our work becomes even more essential as AI capabilities expand exponentially. About Labelbox We're the only company offering three integrated solutions for frontier AI development: Enterprise Platform & Tools: Advanced annotation tools, workflow automation, and quality control systems that enable teams to produce high-quality training data at scale Frontier Data Labeling Service: Specialized data labeling through Aligner, leveraging subject matter experts for next-generation AI models Expert Marketplace: Connecting AI teams with highly skilled annotators and domain experts for flexible scaling Why Join Us High-Impact Environment: We operate like an early-stage startup, focusing on impact over process. You'll take on expanded responsibilities quickly, with career growth directly tied to your contributions. Technical Excellence: Work at the cutting edge of AI development, collaborating with industry leaders and shaping the future of artificial intelligence. Innovation at Speed: We celebrate those who take ownership, move fast, and deliver impact. Our environment rewards high agency and rapid execution. Continuous Growth: Every role requires continuous learning and evolution. You'll be surrounded by curious minds solving complex problems at the frontier of AI. Clear Ownership: You'll know exactly what you're responsible for and have the autonomy to execute. We empower people to drive results through clear ownership and metrics. Role Overview Labelbox is seeking Subject Matter Experts (SMEs) to support high-impact AI data projects across a range of specialized domains, including Math, STEM, Programming, and Internationalization (i18n). 
As an SME, you’ll bring deep domain knowledge to help shape data labeling workflows, ensure quality, and guide contributor success. This is a contract-based role, deployed per project depending on expertise needs. Employment Type: Project-based, paid hourly Your Impact Design project structures and labeling workflows tailored to domain-specific goals Develop clear contributor guidelines and quality assurance frameworks Define ideal contributor profiles and task acceptance criteria Collaborate with internal teams to ensure subject matter accuracy and relevance What You Bring Proven expertise in one or more relevant domains (e.g., Math, Programming, i18n, etc.) Experience designing or reviewing data workflows, educational content, or technical documentation Strong communication and organizational skills Bonus Points Prior experience with data labeling, ML/AI, or evaluation projects Alignerr Services at Labelbox As part of the Alignerr Services team, you'll lead implementation of customer projects and manage our elite network of AI experts who deliver high-quality human feedback crucial for AI advancement. Your team will oversee 250,000+ monthly hours of specialized work across RLHF, complex reasoning, and multimodal AI projects, resulting in quality improvements for Frontier AI Labs. You'll leverage our AI-powered talent acquisition system and exclusive access to 16M+ specialized professionals to rapidly build and deploy expert teams that help customers like Google and ElevenLabs achieve breakthrough AI capabilities through precisely aligned human data—directly contributing to the critical human element in advancing artificial intelligence. 
Life at Labelbox Location: Join our dedicated tech hubs in San Francisco or Wrocław, Poland Work Style: Hybrid model with 2 days per week in office, combining collaboration and flexibility Environment: Fast-paced and high-intensity, perfect for ambitious individuals who thrive on ownership and quick decision-making Growth: Career advancement opportunities directly tied to your impact Vision: Be part of building the foundation for humanity's most transformative technology Our Vision We believe data will remain crucial in achieving artificial general intelligence. As AI models become more sophisticated, the need for high-quality, specialized training data will only grow. Join us in developing new products and services that enable the next generation of AI breakthroughs. Labelbox is backed by leading investors including SoftBank, Andreessen Horowitz, B Capital, Gradient Ventures, Databricks Ventures, and Kleiner Perkins. Our customers include Fortune 500 enterprises and leading AI labs. Your Personal Data Privacy : Any personal information you provide Labelbox as a part of your application will be processed in accordance with Labelbox’s Job Applicant Privacy notice. Any emails from Labelbox team members will originate from a @labelbox.com email address. If you encounter anything that raises suspicions during your interactions, we encourage you to exercise caution and suspend or discontinue communications.
Posted 2 days ago
0 years
0 Lacs
India
On-site
Shape the Future of AI At Labelbox, we're building the critical infrastructure that powers breakthrough AI models at leading research labs and enterprises. Since 2018, we've been pioneering data-centric approaches that are fundamental to AI development, and our work becomes even more essential as AI capabilities expand exponentially. About Labelbox We're the only company offering three integrated solutions for frontier AI development: Enterprise Platform & Tools: Advanced annotation tools, workflow automation, and quality control systems that enable teams to produce high-quality training data at scale Frontier Data Labeling Service: Specialized data labeling through Aligner, leveraging subject matter experts for next-generation AI models Expert Marketplace: Connecting AI teams with highly skilled annotators and domain experts for flexible scaling Why Join Us High-Impact Environment: We operate like an early-stage startup, focusing on impact over process. You'll take on expanded responsibilities quickly, with career growth directly tied to your contributions. Technical Excellence: Work at the cutting edge of AI development, collaborating with industry leaders and shaping the future of artificial intelligence. Innovation at Speed: We celebrate those who take ownership, move fast, and deliver impact. Our environment rewards high agency and rapid execution. Continuous Growth: Every role requires continuous learning and evolution. You'll be surrounded by curious minds solving complex problems at the frontier of AI. Clear Ownership: You'll know exactly what you're responsible for and have the autonomy to execute. We empower people to drive results through clear ownership and metrics. Role Overview Labelbox is seeking Subject Matter Experts (SMEs) to support high-impact AI data projects across a range of specialized domains, including Math, STEM, Programming, and Internationalization (i18n). 
As an SME, you’ll bring deep domain knowledge to help shape data labeling workflows, ensure quality, and guide contributor success. This is a contract-based role, deployed per project depending on expertise needs. Employment Type: Project-based, paid hourly Your Impact Design project structures and labeling workflows tailored to domain-specific goals Develop clear contributor guidelines and quality assurance frameworks Define ideal contributor profiles and task acceptance criteria Collaborate with internal teams to ensure subject matter accuracy and relevance What You Bring Proven expertise in one or more relevant domains (e.g., Math, Programming, i18n, etc.) Experience designing or reviewing data workflows, educational content, or technical documentation Strong communication and organizational skills Bonus Points Prior experience with data labeling, ML/AI, or evaluation projects Alignerr Services at Labelbox As part of the Alignerr Services team, you'll lead implementation of customer projects and manage our elite network of AI experts who deliver high-quality human feedback crucial for AI advancement. Your team will oversee 250,000+ monthly hours of specialized work across RLHF, complex reasoning, and multimodal AI projects, resulting in quality improvements for Frontier AI Labs. You'll leverage our AI-powered talent acquisition system and exclusive access to 16M+ specialized professionals to rapidly build and deploy expert teams that help customers like Google and ElevenLabs achieve breakthrough AI capabilities through precisely aligned human data—directly contributing to the critical human element in advancing artificial intelligence. 
Life at Labelbox Location: Join our dedicated tech hubs in San Francisco or Wrocław, Poland Work Style: Hybrid model with 2 days per week in office, combining collaboration and flexibility Environment: Fast-paced and high-intensity, perfect for ambitious individuals who thrive on ownership and quick decision-making Growth: Career advancement opportunities directly tied to your impact Vision: Be part of building the foundation for humanity's most transformative technology Our Vision We believe data will remain crucial in achieving artificial general intelligence. As AI models become more sophisticated, the need for high-quality, specialized training data will only grow. Join us in developing new products and services that enable the next generation of AI breakthroughs. Labelbox is backed by leading investors including SoftBank, Andreessen Horowitz, B Capital, Gradient Ventures, Databricks Ventures, and Kleiner Perkins. Our customers include Fortune 500 enterprises and leading AI labs. Your Personal Data Privacy : Any personal information you provide Labelbox as a part of your application will be processed in accordance with Labelbox’s Job Applicant Privacy notice. Any emails from Labelbox team members will originate from a @labelbox.com email address. If you encounter anything that raises suspicions during your interactions, we encourage you to exercise caution and suspend or discontinue communications.
Posted 2 days ago
0 years
0 Lacs
India
On-site
Shape the Future of AI At Labelbox, we're building the critical infrastructure that powers breakthrough AI models at leading research labs and enterprises. Since 2018, we've been pioneering data-centric approaches that are fundamental to AI development, and our work becomes even more essential as AI capabilities expand exponentially. About Labelbox We're the only company offering three integrated solutions for frontier AI development: Enterprise Platform & Tools: Advanced annotation tools, workflow automation, and quality control systems that enable teams to produce high-quality training data at scale Frontier Data Labeling Service: Specialized data labeling through Aligner, leveraging subject matter experts for next-generation AI models Expert Marketplace: Connecting AI teams with highly skilled annotators and domain experts for flexible scaling Why Join Us High-Impact Environment: We operate like an early-stage startup, focusing on impact over process. You'll take on expanded responsibilities quickly, with career growth directly tied to your contributions. Technical Excellence: Work at the cutting edge of AI development, collaborating with industry leaders and shaping the future of artificial intelligence. Innovation at Speed: We celebrate those who take ownership, move fast, and deliver impact. Our environment rewards high agency and rapid execution. Continuous Growth: Every role requires continuous learning and evolution. You'll be surrounded by curious minds solving complex problems at the frontier of AI. Clear Ownership: You'll know exactly what you're responsible for and have the autonomy to execute. We empower people to drive results through clear ownership and metrics. Role Overview Labelbox is seeking Subject Matter Experts (SMEs) to support high-impact AI data projects across a range of specialized domains, including Math, STEM, Programming, and Internationalization (i18n). 
As an SME, you’ll bring deep domain knowledge to help shape data labeling workflows, ensure quality, and guide contributor success. This is a contract-based role, deployed per project depending on expertise needs. Employment Type: Project-based, paid hourly Your Impact Design project structures and labeling workflows tailored to domain-specific goals Develop clear contributor guidelines and quality assurance frameworks Define ideal contributor profiles and task acceptance criteria Collaborate with internal teams to ensure subject matter accuracy and relevance What You Bring Proven expertise in one or more relevant domains (e.g., Math, Programming, i18n, etc.) Experience designing or reviewing data workflows, educational content, or technical documentation Strong communication and organizational skills Bonus Points Prior experience with data labeling, ML/AI, or evaluation projects Alignerr Services at Labelbox As part of the Alignerr Services team, you'll lead implementation of customer projects and manage our elite network of AI experts who deliver high-quality human feedback crucial for AI advancement. Your team will oversee 250,000+ monthly hours of specialized work across RLHF, complex reasoning, and multimodal AI projects, resulting in quality improvements for Frontier AI Labs. You'll leverage our AI-powered talent acquisition system and exclusive access to 16M+ specialized professionals to rapidly build and deploy expert teams that help customers like Google and ElevenLabs achieve breakthrough AI capabilities through precisely aligned human data—directly contributing to the critical human element in advancing artificial intelligence. 
Life at Labelbox Location: Join our dedicated tech hubs in San Francisco or Wrocław, Poland Work Style: Hybrid model with 2 days per week in office, combining collaboration and flexibility Environment: Fast-paced and high-intensity, perfect for ambitious individuals who thrive on ownership and quick decision-making Growth: Career advancement opportunities directly tied to your impact Vision: Be part of building the foundation for humanity's most transformative technology Our Vision We believe data will remain crucial in achieving artificial general intelligence. As AI models become more sophisticated, the need for high-quality, specialized training data will only grow. Join us in developing new products and services that enable the next generation of AI breakthroughs. Labelbox is backed by leading investors including SoftBank, Andreessen Horowitz, B Capital, Gradient Ventures, Databricks Ventures, and Kleiner Perkins. Our customers include Fortune 500 enterprises and leading AI labs. Your Personal Data Privacy : Any personal information you provide Labelbox as a part of your application will be processed in accordance with Labelbox’s Job Applicant Privacy notice. Any emails from Labelbox team members will originate from a @labelbox.com email address. If you encounter anything that raises suspicions during your interactions, we encourage you to exercise caution and suspend or discontinue communications.
Posted 2 days ago
3.0 years
0 Lacs
Indore, Madhya Pradesh, India
On-site
Project Role : Data Engineer Project Role Description : Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems. Must have skills : Databricks Unified Data Analytics Platform Good to have skills : Business Agility Minimum 3 Year(s) Of Experience Is Required Educational Qualification : 15 years full time education Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs. Additionally, you will monitor and optimize data workflows to enhance performance and reliability, ensuring that data is accessible and actionable for stakeholders. Roles & Responsibilities: - Databricks experience with the Azure cloud is required. - Expected to perform independently and become an SME. - Actively participate and contribute in team discussions. - Contribute to providing solutions for work-related problems. - Collaborate with data architects and analysts to design scalable data solutions. - Implement best practices for data governance and security throughout the data lifecycle. Professional & Technical Skills: - Must-Have Skills: Proficiency in Databricks Unified Data Analytics Platform. - Good-to-Have Skills: Experience with Business Agility. - Strong understanding of data modeling and database design principles. - Experience with data integration tools and ETL processes. - Familiarity with cloud platforms and services related to data storage and processing. 
Additional Information: - The candidate should have a minimum of 3 years of experience with the Databricks Unified Data Analytics Platform. - This position is based at our Pune office. - 15 years of full-time education is required.
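The extract-transform-load steps this role describes can be sketched minimally. This is an illustrative stand-in in plain Python with sqlite (on Databricks the same flow would typically be written in PySpark against cloud storage); the table name, columns, and sample data are all hypothetical.

```python
import csv
import io
import sqlite3

# Extract: parse raw CSV text (stands in for reading from a source system).
raw = "id,amount,region\n1,100,APAC\n2,,EMEA\n3,250,APAC\n"
rows = list(csv.DictReader(io.StringIO(raw)))

# Transform: apply a simple data-quality rule (reject rows with a missing
# 'amount') and cast string fields to proper types.
clean = [
    {"id": int(r["id"]), "amount": float(r["amount"]), "region": r["region"]}
    for r in rows
    if r["amount"]  # quality gate: amount must be present
]

# Load: write the cleaned rows into a target table (sqlite stands in for
# the destination warehouse or lakehouse table).
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE sales (id INTEGER, amount REAL, region TEXT)")
con.executemany("INSERT INTO sales VALUES (:id, :amount, :region)", clean)

total = con.execute("SELECT COUNT(*), SUM(amount) FROM sales").fetchone()
print(total)  # (2, 350.0): one row was rejected by the quality gate
```

The quality gate in the transform step is the piece the posting emphasizes: bad records are filtered (or in practice quarantined) before load, so downstream consumers only see validated data.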
Posted 2 days ago
5.0 years
0 Lacs
Navi Mumbai, Maharashtra, India
On-site
Project Role : Application Developer Project Role Description : Design, build and configure applications to meet business process and application requirements. Must have skills : Collibra Data Quality & Observability Good to have skills : Collibra Data Governance Minimum 5 Year(s) Of Experience Is Required Educational Qualification : 15 years full time education Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing solutions that align with business objectives, and ensuring that applications are functioning optimally. You will also engage in problem-solving activities, providing support and enhancements to existing applications while maintaining a focus on quality and efficiency. Key Responsibilities: Configure and implement Collibra Data Quality (CDQ) rules, workflows, dashboards, and data quality scoring metrics. Collaborate with data stewards, data owners, and business analysts to define data quality KPIs and thresholds. Develop data profiling and rule-based monitoring using CDQ's native rule engine or integrations (e.g., with Informatica, Talend, or BigQuery). Build and maintain Data Quality Dashboards and Issue Management workflows within Collibra. Integrate CDQ with Collibra Data Intelligence Cloud for end-to-end governance visibility. Drive root cause analysis and remediation plans for data quality issues. Support metadata and lineage enrichment to improve data traceability. Document standards, rule logic, and DQ policies in the Collibra Catalog. Conduct user training and promote data quality best practices across teams. Required Skills and Experience: 3+ years of experience in data quality, metadata management, or data governance. Hands-on experience with Collibra Data Quality & Observability (CDQ) platform. 
Knowledge of Collibra Data Intelligence Cloud including Catalog, Glossary, and Workflow Designer. Proficiency in SQL and understanding of data profiling techniques. Experience integrating CDQ with enterprise data sources (Snowflake, BigQuery, Databricks, etc.). Familiarity with data governance frameworks and data quality dimensions (accuracy, completeness, consistency, etc.). Excellent analytical, problem-solving, and communication skills. Additional Information: - The candidate should have a minimum of 7.5 years of experience in Collibra Data Quality & Observability. - This position is based in Mumbai. - 15 years of full-time education is required.
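The data-quality dimensions named above (completeness, uniqueness, and so on) are typically scored with SQL-style rules. A minimal sketch, assuming a hypothetical `customers` table and using sqlite purely for illustration (Collibra CDQ's rule engine runs its own SQL-like rules against the connected source):

```python
import sqlite3

# Hypothetical sample table with one null email and one duplicated id.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE customers (id INTEGER, email TEXT)")
con.executemany(
    "INSERT INTO customers VALUES (?, ?)",
    [(1, "a@x.com"), (2, None), (3, "a@x.com"), (3, "c@x.com")],
)

# Completeness: fraction of rows where email is non-null.
completeness = con.execute(
    "SELECT 1.0 * COUNT(email) / COUNT(*) FROM customers"
).fetchone()[0]

# Uniqueness: fraction of distinct ids over total rows.
uniqueness = con.execute(
    "SELECT 1.0 * COUNT(DISTINCT id) / COUNT(*) FROM customers"
).fetchone()[0]

print(completeness, uniqueness)  # 0.75 0.75
```

In a CDQ deployment these ratios would be compared against the KPI thresholds agreed with data stewards, and scores below threshold would open an issue in the Issue Management workflow.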
Posted 2 days ago
5.0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
Project Role : Application Lead Project Role Description : Lead the effort to design, build and configure applications, acting as the primary point of contact. Must have skills : Microsoft Azure Analytics Services Good to have skills : NA Minimum 5 Year(s) Of Experience Is Required Educational Qualification : BE Summary: As an Application Lead for Packaged Application Development, you will be responsible for designing, building, and configuring applications using Microsoft Azure Analytics Services. Your typical day will involve leading the effort to deliver high-quality applications, acting as the primary point of contact for the project team, and ensuring timely delivery of project milestones. Roles & Responsibilities: - Lead the effort to design, build, and configure applications using Microsoft Azure Analytics Services. - Act as the primary point of contact for the project team, ensuring timely delivery of project milestones. - Collaborate with cross-functional teams to ensure the successful delivery of high-quality applications. - Provide technical guidance and mentorship to team members, ensuring adherence to best practices and standards. Professional & Technical Skills: - Must-Have Skills: Strong experience with Microsoft Azure Analytics Services. - Good-to-Have Skills: Experience with other Azure services such as Azure Data Factory, Azure Databricks, and Azure Synapse Analytics. - Experience in designing, building, and configuring applications using Microsoft Azure Analytics Services. - Must have Databricks and PySpark skills. - Strong understanding of data warehousing concepts and best practices. - Experience with ETL processes and tools such as SSIS or Azure Data Factory. - Experience with SQL and NoSQL databases. - Experience with Agile development methodologies. Additional Information: - The candidate should have a minimum of 5 years of experience in Microsoft Azure Analytics Services. 
- The ideal candidate will possess a strong educational background in computer science or a related field, along with a proven track record of delivering high-quality applications. - This position is based at our Bengaluru office.
Posted 2 days ago
7.0 years
0 Lacs
Ahmedabad, Gujarat, India
Remote
Job Title : Technical Project Manager Location : Ahmedabad, Gujarat Job Type : Full Time Department : Project management About Simform Simform is a premier digital engineering company specializing in Cloud, Data, AI/ML, and Experience Engineering to create seamless digital experiences and scalable products. Simform is a strong partner for Microsoft, AWS, Google Cloud, and Databricks. With a presence in 5+ countries, Simform primarily serves North America, the UK, and the Northern European market. Simform takes pride in being one of the most reputed employers in the region, having created a thriving work culture with a high work-life balance that gives a sense of freedom and opportunity to grow. Role Overview We are a digital product engineering company that partners with clients to create innovative, high-performing software solutions. We're looking for an experienced Project Manager who can lead from the front, bridging the gap between business and technology while ensuring smooth project execution. Key Responsibilities Lead Agile/Scrum Projects - Manage the full project lifecycle, from planning to execution, ensuring timely delivery. Define & Refine Requirements - Gather and transform client needs into clear documentation, user stories, and deliverables. Sprint Planning & Execution - Assign tasks, run daily stand-ups, track progress, and mitigate risks. Client Communication - Be the point of contact for stakeholders, providing clear updates on scope, timelines, and deliverables. Technical Oversight - Leverage your expertise in web, mobile, and cloud technologies to guide development teams and ensure quality. Architectural & Design Guidance - Help define project architecture, offer technical mentorship, and ensure best practices. Quality Assurance - Oversee testing, release planning, and ensure glitch-free, high-performing applications. 
UX & Design Collaboration - Work closely with designers to maintain visual and UX design excellence, including responsive design principles. Required Skills & Qualifications 7+ years of technical experience, working with multiple technologies and understanding their core concepts. Proven leadership in managing and mentoring teams to successfully deliver projects. Expertise in Agile/SCRUM methodologies for medium-to-large scale applications. Strong interpersonal skills - mentoring, coaching, collaborating, and building high-performing teams. Problem-solving mindset - ability to zoom into details while keeping the big picture in focus. Solid grasp of design patterns and software architecture best practices. Hands-on experience managing projects from concept to production deployment. Why Join Us Young Team, Thriving Culture Flat-hierarchical, friendly, engineering-oriented, and growth-focused culture. Well-balanced learning and growth opportunities Free health insurance. Office facilities with a game zone, in-office kitchen with affordable lunch service, and free snacks. Sponsorship for certifications/events and library service. Flexible work timing, leaves for life events, WFH and hybrid options (ref:hirist.tech)
Posted 2 days ago
12.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Role Overview The Technical Architect - Databricks designs and implements scalable data architectures and solutions. The jobholder has expertise in Databricks Lakehouse, data modeling, and cloud integration, ensuring high performance, security, and reliability. Responsibilities Design and implement Databricks-based data architectures to meet business requirements. Develop and optimize data pipelines using PySpark, Scala, or SQL. Establish the Databricks Lakehouse architecture for batch and streaming data. Collaborate with cross-functional teams to integrate Databricks with cloud platforms (e.g., AWS, Azure, GCP). Ensure data security and compliance with best practices. Monitor and troubleshoot Databricks environments for performance and reliability. Stay updated on Databricks advancements and industry trends. Key Technical Skills & Responsibilities 12+ years of experience in data engineering using Databricks or Apache Spark-based platforms. Proven track record of building and optimizing ETL/ELT pipelines for batch and streaming data ingestion. Hands-on experience with Azure services such as Azure Data Factory, Azure Data Lake Storage, Azure Synapse Analytics, or Azure SQL Data Warehouse. Proficiency in programming languages such as Python, Scala, or SQL for data processing and transformation. Expertise in Spark (PySpark, Spark SQL, or Scala) and Databricks notebooks for large-scale data processing. Familiarity with Delta Lake, Delta Live Tables, and medallion architecture for data lakehouse implementations. 
Build and query Delta Lake storage solutions. Process streaming data with Azure Databricks Structured Streaming. Design Azure Databricks security and data protection solutions. Flatten nested structures and explode arrays with Spark. Transfer data outside Spark pools using the PySpark connector. Optimize Spark jobs and implement best practices in Spark/Databricks. Experience with orchestration tools like Azure Data Factory or Databricks Jobs for scheduling and automation. Knowledge of Git for source control and CI/CD integration for Databricks workflows, cost optimization, and performance tuning. Familiarity with Unity Catalog, RBAC, or enterprise-level Databricks setups. Ability to create reusable components, templates, and documentation to standardize data engineering workflows. Solutioning and presales: architecting frameworks, defining roadmaps, and engaging with stakeholders. Experience in defining data strategy, evaluating new tools/technologies, and driving adoption across the organization. Must have experience working with streaming data sources and Kafka (preferred). Eligibility Criteria Bachelor’s degree in computer science, Information Technology, or a related field. Proven experience as a Databricks Architect or similar role. Complete knowledge of the Azure Databricks platform architecture. Databricks certification (e.g., Certified Data Engineer, Associate Developer). Expertise in Python/Scala/SQL/R. Experience with cloud platforms like AWS, Azure, or GCP. Strong understanding of data modeling and cloud integration. Experience with cluster sizing and security implementation. Excellent problem-solving and communication skills.
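The "flatten nested structures and explode arrays" task mentioned in this posting can be shown in miniature. This sketch uses plain Python on hypothetical order records purely to illustrate the shape of the operation; in Databricks the same transformation would use Spark's `explode` on an array column, producing one output row per array element.

```python
# Each source record carries a nested 'items' array, as often arrives
# from JSON ingestion. Exploding turns one record with N items into
# N flat rows, each repeating the parent keys.
orders = [
    {"order_id": 1, "items": [{"sku": "A", "qty": 2}, {"sku": "B", "qty": 1}]},
    {"order_id": 2, "items": [{"sku": "A", "qty": 5}]},
]

exploded = [
    {"order_id": o["order_id"], "sku": it["sku"], "qty": it["qty"]}
    for o in orders          # outer loop: one pass per parent record
    for it in o["items"]     # inner loop: one output row per array element
]

print(exploded)
# [{'order_id': 1, 'sku': 'A', 'qty': 2},
#  {'order_id': 1, 'sku': 'B', 'qty': 1},
#  {'order_id': 2, 'sku': 'A', 'qty': 5}]
```

Flattening like this is what makes nested ingestion formats queryable with ordinary SQL joins and aggregations downstream.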
Posted 2 days ago