5.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
Budget: 15-20 LPA | Notice Period: Immediate | Location: Technopark, Trivandrum | Experience: 5+ years

We are seeking a highly skilled and experienced Senior AI Engineer with a minimum of 5 years of hands-on experience in designing, developing, and deploying robust Artificial Intelligence and Machine Learning solutions. The ideal candidate will be a strong problem-solver, adept at translating complex business challenges into scalable AI models, and capable of working across the entire AI/ML lifecycle, from data acquisition to model deployment and monitoring. This role requires a deep understanding of various AI techniques, strong programming skills, and a passion for staying updated with the latest advancements in the field.

Key Responsibilities

AI/ML Solution Design & Development:
○ Lead the design and development of innovative and scalable AI/ML models and algorithms to address specific business needs and optimize processes.
○ Apply various machine learning techniques including supervised, unsupervised, and reinforcement learning, deep learning, natural language processing (NLP), and computer vision.
○ Collaborate with data scientists, product managers, and other stakeholders to understand business requirements and translate them into technical specifications.

Data Management & Preprocessing:
○ Oversee the collection, preprocessing, cleaning, and transformation of large and complex datasets to prepare them for AI model training.
○ Implement efficient data pipelines and ensure data quality and integrity.
○ Perform exploratory data analysis to uncover insights and inform model development.

Model Training, Evaluation & Optimization:
○ Train, fine-tune, and evaluate AI/ML models for optimal accuracy, performance, and generalization.
○ Select the most suitable models and algorithms for specific tasks and optimize hyperparameters (see the sketch after this listing).
○ Conduct rigorous testing and debugging of AI systems to ensure reliability and desired outcomes.

Deployment & MLOps:
○ Lead the productionization of AI/ML models, ensuring seamless integration with existing systems and applications.
○ Implement MLOps best practices for model versioning, deployment, monitoring, and retraining.
○ Develop and maintain APIs for AI model integration.

Research & Innovation:
○ Continuously research and evaluate the latest advancements in AI/ML research, tools, and technologies.
○ Propose and implement innovative solutions to complex problems.
○ Contribute to the strategic direction of AI initiatives within the company.

Collaboration & Mentorship:
○ Work collaboratively with cross-functional teams (e.g., software development, data science, product teams).
○ Clearly articulate complex AI concepts to both technical and non-technical audiences.
○ Mentor junior AI engineers and contribute to a culture of continuous learning.

Required Skills and Qualifications

Bachelor's or Master's degree in Computer Science, Artificial Intelligence, Machine Learning, Data Science, or a related quantitative field.
Minimum of 5 years of professional experience as an AI Engineer, Machine Learning Engineer, or a similar role.
Expert-level proficiency in at least one major programming language used for AI development, such as Python (preferred), Java, or C++.
Extensive experience with popular AI/ML frameworks and libraries, such as TensorFlow, PyTorch, Keras, Scikit-learn, and Hugging Face Transformers.
Strong understanding of core machine learning concepts, algorithms, and statistical modeling (e.g., regression, classification, clustering, dimensionality reduction).
Solid knowledge of deep learning architectures (e.g., CNNs, RNNs, Transformers) and their applications.
Experience with data manipulation and analysis libraries (e.g., Pandas, NumPy).
Familiarity with database systems (SQL, NoSQL) and big data technologies (e.g., Apache Spark, Hadoop) for managing large datasets.
Experience with cloud platforms (e.g., AWS, Azure, Google Cloud) and their AI/ML services for scalable deployment.
Understanding of software development best practices, including version control (Git), testing, and code review.
Excellent problem-solving skills, analytical thinking, and a data-driven approach.
Strong communication and interpersonal skills, with the ability to explain technical concepts clearly.
Ability to work independently and as part of a collaborative team in a fast-paced environment.
Immediate joiner preferred.
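For context on the hyperparameter-optimization responsibility above, here is a minimal, self-contained sketch using scikit-learn (one of the libraries named in the listing); the synthetic dataset, model choice, and parameter grid are illustrative assumptions, not part of the role:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV, train_test_split

# Synthetic stand-in for a real business dataset (assumption for illustration).
X, y = make_classification(n_samples=500, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Grid search over a small hypothetical parameter grid, scored by F1 across 5 folds.
param_grid = {"n_estimators": [100, 200], "max_depth": [None, 10]}
search = GridSearchCV(RandomForestClassifier(random_state=42), param_grid, cv=5, scoring="f1")
search.fit(X_train, y_train)

print("best params:", search.best_params_)
print("held-out F1:", search.score(X_test, y_test))
```

In practice the grid, scoring metric, and validation scheme would be driven by the business problem rather than fixed up front.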
Posted 4 days ago
15.0 years
0 Lacs
Gurugram, Haryana, India
Remote
About VWO

VWO is a leading Digital Experience Optimization platform trusted by over 3,000 businesses in 100+ countries, including global brands like Samsung, Vodafone, Toyota, HBO, and Domino's. What began 15 years ago as one of the world’s first A/B testing tools has since evolved into a comprehensive, enterprise-grade platform used by product, marketing, and growth teams to experiment, personalize, analyze behavior, and build exceptional digital experiences. Today, VWO offers a full-stack suite for A/B testing, multivariate testing, feature rollouts, heatmaps, session recordings, behavioural analytics, surveys, personalization, and more across web, mobile, and server-side applications — all in one unified platform. We enable teams to make confident, data-driven decisions that drive user engagement, retention, and conversion.

VWO is a profitable, founder-led business with $50M+ ARR, strong EBITDA margins, and a history of capital-efficient, sustainable growth. In January 2025, Everstone Capital acquired a majority stake in the company to help accelerate our global expansion, both organically and inorganically. We are a fully remote team of 450+ people, with go-to-market teams across the Americas, Europe, and APAC, and product and engineering anchored in India. Our culture values deep thinking, fast execution, and strong ownership, with minimal bureaucracy and high autonomy. Despite our scale, we continue to operate with the agility and ambition of a startup.

We are seeking a Principal Data Architect to lead our Data Platforms Team and drive innovation in data engineering, analytics, and governance.

Role Overview:
As a Principal Data Architect, you will be responsible for leading the design, development, and scaling of our data infrastructure. You will collaborate with product managers, data scientists, and engineers to ensure our data pipelines and architectures are robust, scalable, and aligned with business objectives. The role requires a strong background in backend development, data processing, and scalable architecture design.

Key Responsibilities:
● Lead and mentor a team of data engineers, ensuring high performance and career growth.
● Architect and optimize scalable data infrastructure, ensuring high availability and reliability.
● Drive the development and implementation of data governance frameworks and best practices.
● Work closely with cross-functional teams to define and execute a data roadmap.
● Optimize data processing workflows for performance and cost efficiency.
● Ensure data security, compliance, and quality across all data platforms.
● Foster a culture of innovation and technical excellence within the data team.

Required Skills & Experience:
● 10+ years of experience in data engineering, with at least 3+ years in a leadership role.
● Expertise in backend development with programming languages such as Java, PHP, Python, Node.js, Go, JavaScript, HTML, and CSS.
● Proficiency in SQL, Python, and Scala for data processing and analytics.
● Strong understanding of cloud platforms (AWS, GCP, or Azure) and their data services.
● Strong foundation and expertise in HLD and LLD, as well as design patterns, preferably using Spring Boot or Google Guice.
● Experience in big data technologies such as Spark, Hadoop, Kafka, and distributed computing frameworks.
● Hands-on experience with data warehousing solutions such as Snowflake, Redshift, or BigQuery.
● Deep knowledge of data governance, security, and compliance (GDPR, SOC 2, etc.).
● Experience in NoSQL databases like Redis, Cassandra, MongoDB, and TiDB.
● Familiarity with automation and DevOps tools like Jenkins, Ansible, Docker, Kubernetes, Chef, Grafana, and ELK.
● Proven ability to drive technical strategy and align it with business objectives.
● Strong leadership, communication, and stakeholder management skills.

Preferred Qualifications:
● Experience in machine learning infrastructure or MLOps is a plus.
● Exposure to real-time data processing and analytics.
● Interest in data structures, algorithm analysis and design, multicore programming, and scalable architecture.
● Prior experience in a SaaS or high-growth tech company.

Why Join Wingify?
● Work on cutting-edge data technologies and large-scale distributed systems.
● A remote-friendly work environment with flexible working hours.
● A culture of innovation, learning, and open communication.
● Competitive compensation, benefits, and growth opportunities.

Join us in shaping the future of data-driven decision-making at Wingify!
Posted 4 days ago
2.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Key Responsibilities:
· Design, develop, and deploy AI/ML models and algorithms.
· Collaborate with cross-functional teams (data scientists, engineers, product managers) to identify opportunities for AI integration.
· Optimize machine learning pipelines for performance and scalability.
· Collect, clean, and preprocess large datasets for training and inference (see the sketch after this listing).
· Conduct model evaluation, A/B testing, and continuous improvements.
· Implement deep learning, NLP, computer vision, or generative AI solutions as needed.
· Stay up to date with the latest AI research, tools, and best practices.
· Ensure responsible AI development, addressing ethical, fairness, and bias concerns.

Required Qualifications:
· Bachelor’s or Master’s degree in Computer Science, AI, Data Science, Engineering, or a related field. PhD is a plus.
· 2+ years of experience in AI/ML development (for mid-level roles).
· Proficiency in Python and ML frameworks (TensorFlow, PyTorch, Scikit-learn).
· Strong understanding of machine learning algorithms, data structures, and statistical modeling.
· Experience with cloud platforms (AWS, GCP, Azure) and MLOps tools.
· Familiarity with version control, containerization (Docker), and CI/CD pipelines.

Preferred Qualifications:
· Experience in NLP, computer vision, LLMs, or generative AI.
· Knowledge of data engineering tools (Spark, Hadoop, Airflow).
· Contributions to open-source AI projects or research publications.
· Strong problem-solving, communication, and teamwork skills.

Location: Chennai
E-mail: hr@wethreee.com
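As a small illustration of the collect-clean-preprocess responsibility above, a hedged pandas sketch; the column names and cleaning rules here are hypothetical, not from the listing:

```python
import pandas as pd

# Hypothetical raw records; real inputs would come from files, APIs, or databases.
raw = pd.DataFrame({
    "user_id": [1, 2, 2, 3, 4],
    "age": [34, None, None, 29, 51],
    "plan": ["free", "pro", "pro", None, "free"],
})

clean = (
    raw.drop_duplicates(subset="user_id")  # remove duplicate users
       .assign(
           age=lambda d: d["age"].fillna(d["age"].median()),  # impute missing ages
           plan=lambda d: d["plan"].fillna("unknown"),        # label missing categories
       )
)
features = pd.get_dummies(clean, columns=["plan"])  # one-hot encode for training
print(features)
```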
Posted 4 days ago
1.0 - 3.0 years
0 Lacs
Gurugram, Haryana, India
On-site
**Position:** ETL Developer (Client Requirement)
**Experience:** 1-3 years
**Location:** Gurgaon
**Employment Type:** Full time
**Budget:** Up to 35,000 - 40,000

We are looking for a passionate and detail-oriented **ETL Developer** with 1 to 3 years of experience in building, testing, and maintaining ETL processes. The ideal candidate should have a strong understanding of data warehousing concepts, ETL tools, and database technologies.

### **Key Responsibilities:**

✅ Design, develop, and maintain ETL workflows and processes using [specify tools e.g., Informatica / Talend / SSIS / Pentaho / custom ETL frameworks] (a minimal workflow sketch follows this listing).
✅ Understand data requirements and translate them into technical specifications and ETL designs.
✅ Optimize and troubleshoot ETL processes for performance and scalability.
✅ Ensure data quality, integrity, and security across all ETL jobs.
✅ Perform data analysis and validation for business reporting.
✅ Collaborate with Data Engineers, DBAs, and Business Analysts to ensure smooth data operations.

---

### **Required Skills:**

* 1-3 years of hands-on experience with ETL tools (e.g., **Informatica, Talend, SSIS, Pentaho**, or equivalent).
* Proficiency in SQL and experience working with **RDBMS** (e.g., **SQL Server, Oracle, MySQL, PostgreSQL**).
* Good understanding of **data warehousing concepts** and **data modeling**.
* Experience in handling **large datasets** and performance tuning of ETL jobs.
* Ability to work in Agile environments and participate in code reviews.

---

### **Preferred Skills (Good to Have):**

* Experience with **cloud ETL solutions** (AWS Glue, Azure Data Factory, GCP Dataflow).
* Exposure to **big data ecosystems** (Hadoop, Spark).
* Basic knowledge of **Python / Shell scripting** for automation.
* Familiarity with **version control (Git)** and **CI/CD pipelines**.

🎓 Bachelor’s degree in Computer Science, Engineering, Information Technology, or related field.
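Since the listing leaves the ETL tool unspecified, here is a tool-agnostic extract-transform-load sketch in Python, with an in-memory SQLite database standing in for both source and target; all table and column names are invented for illustration:

```python
import sqlite3

import pandas as pd

# Extract: read from a source table (SQLite stands in for any RDBMS here).
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE orders (id INTEGER, amount REAL, status TEXT)")
src.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                [(1, 120.0, "paid"), (2, 80.0, "cancelled"), (3, 45.5, "paid")])
df = pd.read_sql("SELECT * FROM orders", src)

# Transform: keep only paid orders and derive a tax column (hypothetical business rule).
df = df[df["status"] == "paid"].assign(tax=lambda d: d["amount"] * 0.18)

# Load: write to a target warehouse table.
tgt = sqlite3.connect(":memory:")
df.to_sql("fact_orders", tgt, index=False)
print(pd.read_sql("SELECT * FROM fact_orders", tgt))
```

A production job built in Informatica, Talend, or SSIS expresses the same extract, transform, and load stages through the tool's own components rather than hand-written code.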
Posted 4 days ago
0.0 - 3.0 years
0 - 0 Lacs
Gomtinagar, Lucknow, Uttar Pradesh
On-site
Job Title: Sr. ASP.NET / .NET Developer
Location: Lucknow
Experience: 4 to 5 years
Employment Type: Full-Time

Job Summary:
We are looking for a passionate and experienced ASP.NET / .NET Developer with 4 to 5 years of hands-on experience in developing scalable web applications and services using the Microsoft .NET technology stack. You will be involved in the full software development lifecycle, including coding, database design, API integration, and collaborating with the frontend team. This role also provides opportunities to contribute to system design and mentor junior team members.

Key Responsibilities:
Develop and maintain web applications using ASP.NET, MVC/Core, and C#.
Design, develop, and optimize databases using Microsoft SQL Server.
Build and integrate RESTful APIs for frontend and third-party applications.
Participate in code reviews and contribute to discussions on system design and architecture.
Troubleshoot, debug, and resolve technical issues in existing applications.
Participate in Agile/Scrum ceremonies including sprint planning and retrospectives.
Follow secure coding practices and adhere to development standards.
Support and mentor junior developers when needed.

Required Skills:
Hands-on experience with ASP.NET, ASP.NET MVC, ASP.NET Core, and .NET Framework 4.5+.
Strong proficiency in C# programming.
Experience in developing and integrating RESTful APIs.
Solid understanding of Microsoft SQL Server including T-SQL and database design.
Proficiency with front-end technologies such as HTML5, CSS3, JavaScript, jQuery, Bootstrap, and AngularJS.
Familiarity with version control tools such as Git.
Understanding of object-oriented programming (OOP) principles.

Preferred Skills:
Experience with desktop application development using Windows Forms or WPF.
Familiarity with modern JavaScript frameworks like Angular or React.
Exposure to Azure DevOps, CI/CD pipelines, or cloud-based deployments.
Knowledge of Big Data technologies (e.g., Hadoop, Spark) is a plus.
Familiarity with Agile/Scrum methodology.

Qualifications:
Bachelor’s degree in Computer Science, Engineering, or a related field.
4 to 5 years of hands-on experience in .NET development.

Job Types: Full-time, Permanent
Pay: ₹45,000.00 - ₹60,000.00 per month
Benefits: Paid sick time, Provident Fund
Location Type: In-person
Schedule: Fixed shift
Ability to commute/relocate: Gomtinagar, Lucknow, Uttar Pradesh: Reliably commute or planning to relocate before starting work (Preferred)
Experience: .NET Core: 3 years (Preferred)
Location: Gomtinagar, Lucknow, Uttar Pradesh (Preferred)
Work Location: In person
Speak with the employer: +91 9598071606
Posted 4 days ago
2.0 - 5.0 years
4 - 7 Lacs
Bengaluru
Work from Office
Your Impact:
As a developer you will work as part of a highly skilled team of professionals responsible for architecting, designing, and developing cost-effective and sustainable solutions for the Security Product business of OpenText. Strong organizational skills, technical expertise, and attention to detail are key in this customer-focused role.

What the role offers:
Translate business requirements using complex methods/models to determine appropriate system solutions.
Work within a cross-functional team to provide technical expertise in the design and planning of system solutions.
Research, identify, test, certify, and select technology required for solution delivery.
Maximize the performance, uptime, and supportability of the product.
Develop a highly scalable security product using technologies such as Java, J2EE, REST, Azure, AWS, GCP, and Snowflake.
Work with the team to design solutions to security problems, and monitor and analyze the security vulnerabilities reported in bundled third-party products.
Design and implement new interface components in collaboration with the product owner and other OpenText development teams.
Collaborate with engineering and development partners to develop reliable, cost-effective, and high-quality software solutions.
Maintain the existing components and resolve problems reported by customers.
Enhance existing components with new capabilities whilst maintaining compatibility.
Provide feedback on test plans, test cases, and test methodologies.
Research new technologies for product improvements and the future roadmap.
Communicate with stakeholders, provide project progress, and highlight any risks involved along with a mitigation plan.

What you need to succeed:
Bachelor's or master's degree in Computer Science, Information Systems, or equivalent.
2-5 years of software development experience building large-scale and highly distributed applications.
Experience developing highly scalable security products using technologies such as Java, J2EE, REST/SOAP, AWS, GCP, Snowflake, and Azure.
Demonstrated ability to have completed multiple, complex technical projects.
Strong programming skills in Java and J2EE.
Experience in cloud (AWS, GCP, or Azure) is a must.
Experience working in a DevOps, continuous-integration environment.
Excellent communication skills and ability to interact effectively with both technical and non-technical staff.
In-depth technical experience in the IT infrastructure area and an understanding of the operational challenges involved in managing complex systems.
Previous experience being part of complex integration projects.
Technical execution of project activities and responsibility for on-time delivery and results.
Interfacing with customer-facing functions to gather project requirements and performing due diligence as required.
Providing technical guidance for troubleshooting and issue resolution when needed.
Familiarity with Agile software development (preferably Scrum).
Unit testing and mocking frameworks such as Mockito.

Desired Skills:
Understanding of the security domain.
Experience in Azure, AWS, GCP, and Hadoop.
Working knowledge of Linux.
Cloud technologies and cloud application development.
Good knowledge of security threat models and of various security encryption techniques.
Knowledge of different types of security vulnerabilities, attack vectors, and common types of cyberattacks.
Posted 4 days ago
5.0 - 10.0 years
7 - 12 Lacs
Hyderabad
Work from Office
YOUR IMPACT
As a Sr. Software Engineer you will utilize superior knowledge and experience to perform highly complex product architecture, design, systems analysis, research, maintenance, troubleshooting, and other programming activities. You will also play a key role in the development of work teams by providing others with direction and leadership.

What The Role Offers
Produce high-quality code according to design specifications.
Utilize superior analytical skills to troubleshoot and fix highly complex code defects.
Define best practices and standards for database design.
Serve as a thought leader, mentor, and advocate for best practices concerning data architecture.
Lead software design/code reviews to ensure quality and adherence to company standards.
Provide status updates to stakeholders and escalate issues when necessary.

What You Need to Succeed
Bachelor's degree in Computer Science or a related field with 5+ years of enterprise product development experience.
Good knowledge of algorithm design and analysis, including knowledge of data structures and design patterns.
Good experience in RDBMS with one of the following databases: PostgreSQL, Oracle, MSSQL, MySQL, or DB2.
Good experience in client-side technologies (CSS, HTML, JavaScript, JS frameworks, HTML5, D3.js, FusionCharts) and frameworks.
Expert in Java, JSP, JavaScript, HTML5, CSS/CSS3, and Bootstrap.
Expert in any of the JavaScript web frameworks (jQuery, AngularJS, React, etc.).
Firm grasp of lexical scoping, closures, and OO JavaScript.
Experience configuring and working with LDAP and SSO systems.
Experience handling performance- and security-related aspects of web applications.
Experience working with at least one of the following application servers is strongly preferred: Tomcat, JBoss, Oracle Application Server/WebLogic, WebSphere.
Experience/knowledge of SOAP APIs and/or RESTful APIs is strongly preferred.
Contribution to the continual improvement of our agile development processes.
Strong hands-on experience with building enterprise applications.
Knowledge/experience with Spark, Hadoop, Hive, or MongoDB is a strong plus.
Knowledge/experience with Docker and Kubernetes is a strong plus.
Posted 4 days ago
2.0 years
0 Lacs
Gurugram, Haryana, India
On-site
We are looking for a Data Engineer to join the Data Platform team who can help develop and deploy data pipelines at a huge scale of ~5B daily events and a concurrency of 500K users. The platform is built on an AWS-based modern data stack enabling real-time reporting and data exploration from first principles.

Experience: 2-5 Years
Job Location: Gurgaon

Responsibilities:
➔ Create and maintain a robust, scalable, and optimized data pipeline (see the sketch after this listing).
➔ Handle TBs of data volume daily on the Wynk Music/Xstream (VOD & live TV) data platform.
➔ Extract and consume data from our live systems to analyze and produce a robust 99.999% SLA big-data environment.
➔ Build and execute data mining and modeling activities using agile development techniques.
➔ Solve problems in robust and creative ways and demonstrate solid verbal, interpersonal, and written communication skills.
➔ Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources.
➔ Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
➔ Handle multiple kinds of SQL and NoSQL databases for both structured and unstructured datasets.
➔ Appreciate and understand the cloud computation model and how it affects application solutions, in both delivery and deployment.
➔ The incumbent will be in the driving seat for innovation and delivery across various exciting tech opportunities, from powering business analytics pipelines to machine learning applications.

Desired Profile:
➔ B.E / B.Tech / M.E / M.Tech / M.S in Computer Science or Software Engineering from a premier institute.
➔ 2+ years of experience in Spark and Scala.
➔ Should be very strong with data structures and algorithms.
➔ Fluent with Hadoop/Pig/Hive/Storm/SQL.
➔ Good to have: knowledge of NoSQL solutions like Cassandra/MongoDB/CouchDB, Postgres, Kafka, Redis, ElasticSearch, and Spark Streaming, or a real desire to learn them.
➔ Own end-to-end product modules/features (from requirement to going live).
➔ Knowledge of at least one programming or scripting language like Java, Scala, or Python.
➔ Hands-on experience in big data infrastructure, distributed systems, dimensional modeling, query processing & optimization, and relational databases.
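As a rough illustration of the kind of pipeline this role describes, a minimal PySpark aggregation sketch; the event schema, S3 paths, and column names are assumptions, and a real job at ~5B events/day would add partition pruning, tuning, and orchestration on top:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily-event-rollup").getOrCreate()

# Hypothetical event feed: JSON records with ts, user_id, and event_type fields.
events = spark.read.json("s3a://example-bucket/events/")

daily = (
    events.withColumn("dt", F.to_date("ts"))
          .groupBy("dt", "event_type")
          .agg(F.count("*").alias("event_count"),
               F.countDistinct("user_id").alias("unique_users"))
)

# Write partitioned Parquet for downstream real-time reporting.
daily.write.mode("overwrite").partitionBy("dt").parquet("s3a://example-bucket/agg/daily/")
```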
Posted 4 days ago
170.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Summary
Responsible for solutions for business requirements that are feasible in the CSA application using existing components and frameworks. Responsible for deliverables of CSA functional requirements. Responsible for reviewing sources of CSL/Front-End modules.

Key Responsibilities

Strategy
Responsible for solutions for business requirements that are feasible in the CSA application using existing components and frameworks.
Need to participate in solution discussions with a view to client experience and functional deliverables.
Responsible for managing developers and guiding them to produce quality deliverables for the CSA application.

Business
Responsible for deliverables of CSA functional requirements.
Responsible for providing solutions for business requirements.
Responsible for implementing solutions for CSA functional requirements.

Processes
Responsible for enhancements of development processes of the CSA application.
Responsible for advising on tools that can enhance the CSA development process.
Responsible for reviewing sources of CSL/Front-End modules.

People & Talent
Responsible for guiding CSL/Front-End developers to understand existing CSL/Front-End components.
Responsible for enforcing coding standards of CSA CSL/Front-End programs.

Risk Management
Responsible for adhering to the Group Risk Management framework and process for the CSL/Front-End development life cycle.
Responsible for placing appropriate controls that help eliminate risks while delivering functional deliverables.

Governance
Responsible for maintaining awareness of Group standards and policies.
Responsible for providing inputs to developers about Group standards, policies, and regulatory procedures.

Regulatory & Business Conduct
Display exemplary conduct and live by the Group’s Values and Code of Conduct.
Take personal responsibility for embedding the highest standards of ethics, including regulatory and business conduct, across Standard Chartered Bank. This includes understanding and ensuring compliance with, in letter and spirit, all applicable laws, regulations, guidelines and the Group Code of Conduct.
Lead the team to achieve the outcomes set out in the Bank’s Conduct Principles: effectively and collaboratively identify, escalate, mitigate and resolve risk, conduct and compliance matters.

Key Stakeholders
CCIB-Channels Management Team, CCIB-Channels Product Owners.

Other Responsibilities
Ensure completion of necessary e-learning, role-specific modules, etc.

Qualifications
Should have a professional degree in software engineering (preferably Bachelor of Engineering, Master of Computer Applications, etc.).

Role-Specific Technical Competencies
As a Senior Tech Manager, work on understanding functional requirements and designing functionality using use-case and sequence diagrams.
Contribute to product design and establishment of requirements.
Participate in POCs and mentor junior associates.
Experience in implementing scalable web applications with fully automated deployment and control using Docker, Kubernetes, AWS, Jenkins, etc.
Experience in front-end programming using React JS etc., and in mentoring other resources in technical upskilling.
Experience in Big Data systems (Hadoop, Apache Spark).

About Standard Chartered
We're an international bank, nimble enough to act, big enough for impact. For more than 170 years, we've worked to make a positive difference for our clients, communities, and each other. We question the status quo, love a challenge and enjoy finding new opportunities to grow and do better than before. If you're looking for a career with purpose and you want to work for a bank making a difference, we want to hear from you. You can count on us to celebrate your unique talents and we can't wait to see the talents you can bring us.

Our purpose, to drive commerce and prosperity through our unique diversity, together with our brand promise, to be here for good, are achieved by how we each live our valued behaviours. When you work with us, you'll see how we value difference and advocate inclusion.

Together We:
Do the right thing and are assertive, challenge one another, and live with integrity, while putting the client at the heart of what we do.
Never settle, continuously striving to improve and innovate, keeping things simple and learning from doing well, and not so well.
Are better together: we can be ourselves, be inclusive, see more good in others, and work collectively to build for the long term.

What We Offer
In line with our Fair Pay Charter, we offer a competitive salary and benefits to support your mental, physical, financial and social wellbeing.
Core bank funding for retirement savings, medical and life insurance, with flexible and voluntary benefits available in some locations.
Time-off including annual leave, parental/maternity (20 weeks), sabbatical (12 months maximum) and volunteering leave (3 days), along with minimum global standards for annual and public holiday, which is combined to 30 days minimum.
Flexible working options based around home and office locations, with flexible working patterns.
Proactive wellbeing support through Unmind, a market-leading digital wellbeing platform, development courses for resilience and other human skills, a global Employee Assistance Programme, sick leave, mental health first-aiders and all sorts of self-help toolkits.
A continuous learning culture to support your growth, with opportunities to reskill and upskill and access to physical, virtual and digital learning.
Being part of an inclusive and values-driven organisation, one that embraces and celebrates our unique diversity, across our teams, business functions and geographies - everyone feels respected and can realise their full potential.
Posted 4 days ago
0 years
0 Lacs
Jaipur, Rajasthan, India
On-site
Our Culture & Values:
We’d describe our culture as human, friendly, engaging, supportive, agile, and super collaborative. At Kainskep Solutions, our five values underpin everything we do, from how we work to how we delight and deliver to our customers. Our values are: #TeamMember #Ownership #Innovation #Challenge and #Collaboration.

What makes a great team? A diverse team! Don’t be put off if you don’t tick all the boxes; we know from research that candidates may not apply if they don’t feel they are 100% there yet. The essential experience we need is the ability to engage clients and build strong, effective relationships. If you don’t tick the rest, we would still love to talk. We’re committed to creating a diverse and inclusive team.

What you’ll bring:
Use programming languages like Python, R, and SQL for data manipulation, statistical analysis, and machine learning tasks.
Apply fundamental statistical concepts such as mean, median, variance, probability distributions, and hypothesis testing to analyze data.
Develop supervised and unsupervised machine learning models, including classification, regression, clustering, and dimensionality reduction techniques.
Evaluate model performance using metrics such as accuracy, precision, recall, and F1-score, implementing cross-validation techniques to ensure reliability (see the sketch after this listing).
Conduct data manipulation and visualization using libraries such as Pandas, Matplotlib, Seaborn, and ggplot2, implementing data cleaning techniques to handle missing values and outliers.
Perform exploratory data analysis, feature engineering, and data mining tasks, including text mining, natural language processing (NLP), and web scraping.
Familiarize yourself with big data technologies such as Apache Spark and Hadoop, understanding distributed computing concepts to handle large-scale datasets effectively.
Manage relational databases (e.g., MySQL, PostgreSQL) and NoSQL databases (e.g., MongoDB, Cassandra) for data storage and retrieval.
Use version control systems like Git and GitHub/GitLab for collaborative development, understanding branching, merging, and versioning workflows.
Demonstrate basic knowledge of the software development lifecycle, Agile methodologies, algorithms, and data structures.

Requirements:
Proficiency in programming languages such as Python, R, and SQL.
Strong analytical skills and a passion for working with data.
Proven experience with generative models like GPT, BERT, DALL·E, Midjourney, GANs, VAEs, etc.
Proficiency in Python and deep learning frameworks such as TensorFlow, PyTorch, or JAX.
Strong understanding of NLP, computer vision, and transformer architectures.
Hands-on experience with Hugging Face Transformers, LangChain, OpenAI API, or similar tools.
Prior experience with data analysis, machine learning, or related fields is a plus.

Good To Have:
Experience in Computer Vision, including image processing and video processing.
Familiarity with Generative AI techniques, such as Generative Adversarial Networks (GANs), and their applications in image, text, and other data generation tasks.
Knowledge of Large Language Models (LLMs) is a plus.
Experience with Microsoft AI technologies, including Azure AI Studio and Azure Copilot Studio.
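A compact sketch of the cross-validated evaluation bullet above, using scikit-learn on synthetic data; the classifier and fold count are illustrative choices only:

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_validate

# Synthetic stand-in data (assumption for illustration).
X, y = make_classification(n_samples=400, n_features=10, random_state=0)

# 5-fold cross-validation over the metrics named in the listing.
scores = cross_validate(
    LogisticRegression(max_iter=1000), X, y, cv=5,
    scoring=["accuracy", "precision", "recall", "f1"],
)
for metric in ["accuracy", "precision", "recall", "f1"]:
    print(metric, scores[f"test_{metric}"].mean().round(3))
```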
Posted 4 days ago
8.0 years
0 Lacs
Thane, Maharashtra, India
On-site
Technical Lead

Experience: 8 - 10 Years
Salary: Competitive
Preferred Notice Period: Within 30 Days
Opportunity Type: Hybrid (Mumbai)
Placement Type: Permanent

(*Note: This is a requirement for one of Uplers' Clients)

Must-have skills required: Java OR Spring, AWS OR GCP, MongoDB OR JavaScript

Ripplehire (One of Uplers' Clients) is looking for a Technical Lead who is passionate about their work, eager to learn and grow, and committed to delivering exceptional results. If you are a team player with a positive attitude and a desire to make a difference, then we want to hear from you.

Role Overview Description
We are looking for a passionate Team Lead with 8+ years of experience to tackle some of the most meaningful problems in the recruitment space. You'll be at the forefront of building AI agents to transform recruitment workflows, designing next-generation candidate experiences, and revolutionizing how enterprises manage their hiring processes.

As Team Lead, you'll guide development efforts toward successful project delivery while providing technical coaching and mentorship to your team. Your leadership will be crucial in maintaining high standards of software quality and identifying areas for growth and improvement within your team. Your code will have millions of eyeballs from 200 countries, right from Azerbaijan to Australia, and closer to home, you will help hire from Jhumritalaiya to Jammu.

Key Responsibilities
Guide team development efforts towards successful project delivery.
Provide technical leadership to teammates through coaching and mentorship.
Maintain high standards of software quality within the team by establishing good practices and habits.
Identify and encourage areas for growth and improvement within the team.
Collaborate with other software developers, business analysts, and software architects to plan, design, develop, test, and maintain web and desktop-based business applications.
Assist in the collection and documentation of user requirements, development of user stories, estimates, and work plans.
Design and implement scalable architecture to support data-intensive applications.
Oversee the deployment of applications to cloud providers like AWS or GCP.

Requirements
8+ years of experience developing software on a Java/J2EE and relational database stack.
Proven experience in leading technical teams and mentoring junior developers.
Strong expertise in designing scalable architecture to support data-intensive applications.
Mastery of technologies including Spring, Hibernate, SQL, and REST to design microservices-based architecture.
Experience in setting up and deploying applications to cloud providers like AWS or GCP.
Ability to harness AI as an engineering assistant to improve productivity and code quality.

Good to have
Experience with frontend development - JavaScript frameworks like Backbone/Angular.
Data science experience - fetching data from multiple sources, modeling, and extracting information, with familiarity in tooling, processing, and deployment (MongoDB, Hadoop, Mahout, Neo4j, etc.).
Information security experience - OWASP Security principles and implementation.

Our Tech Stack
Java, Spring, Hibernate, MySQL - RDS, MongoDB, Apache Solr, Spring Cloud, S3 - Angular 2, Backbone JS, Azure OpenAI. Our applications are hosted on AWS and GCP.

How to apply for this opportunity:
Easy 3-Step Process:
1. Click On Apply! And Register or log in on our portal
2. Upload updated Resume & Complete the Screening Form
3. Increase your chances to get shortlisted & meet the client for the Interview!

About Our Client:
Ripplehire is a recruitment SaaS for companies to identify correct candidates from employees' social networks and gamify the employee referral program with contests and referral bonuses to engage employees in the recruitment process. Developed and managed by Trampoline Tech Private Limited. Recognized by InTech50 as one of the Top 50 innovative enterprise software companies coming out of India, and an NHRD (HR Association) Staff Pick for the most innovative social recruiting tool in India. Used by 7 clients as of July 2014. It is a tool available on a subscription-based pricing model.

About Uplers:
Our goal is to make hiring and getting hired reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant product and engineering job opportunities and progress in their career. (Note: There are many more opportunities apart from this on the portal.)

So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
Posted 4 days ago
7.0 - 10.0 years
0 Lacs
Hyderabad, Telangana, India
Remote
Role: VCP MO Engineer
Required Technical Skill Set: Linux, Kubernetes, Ansible
Location: Hyderabad

Desired Competencies (Technical/Behavioral Competency)

Must-Have:
Experience: 7-10 years
Associates should have good communication skills, experience in L1 & L2 support, and experience in stakeholder management.
Analytical skills for problem solving.
Working experience in a customer secure location, following all processes and security protocols.
Working experience with an offshore team (India).
Should have excellent communication skills and help with customer coordination.

Good-to-Have:
Linux administration.
Kubernetes and basic concepts like etcd, Istio, and CNF.
Hadoop Data Platform.
MongoDB administration.
Splunk.
Mesosphere DC/OS.
Knowledge of containers.
Ansible.
RHEV and virtual machines.

Responsibilities of / Expectations from the Role:
1. New installation of software in the customer environment, remotely or physically.
2. Implementing/deploying software releases as per plan in the customer environment remotely.
3. Implementing/deploying changes in software in the customer environment remotely.
4. Monitor multiple customer environments with the help of remote monitoring tools.
5. Associates should have a thorough knowledge of the SLAs.
Posted 4 days ago
4.0 years
0 Lacs
Mumbai, Maharashtra, India
Remote
About This Role

Position Overview:
BlackRock is seeking a highly skilled and motivated Associate to support its growing and dynamic data governance function. In this role, you will be responsible for ensuring that data products are accurately defined and compliant with data governance policies. You will work closely with cross-functional teams—including business stakeholders and technical teams—to oversee the full lifecycle of data products from ideation to delivery. The ideal candidate will have at least 4 years of experience in data governance and data management or stewardship, and will thrive in a fast-paced, results-driven environment.

Key Responsibilities
As an Associate Data Steward, your responsibilities will span several key areas:

Business & Strategic Acumen: You will collaborate closely with business units to understand evolving data requirements and effectively govern data products to meet strategic goals and objectives. You will ensure that data products support various use cases, such as operational efficiencies, risk management, and commercial applications, while defining success criteria for data offerings in collaboration with key stakeholders.

Data Governance & Quality: A core aspect of this role is managing data quality through the application of robust data governance controls. You will be responsible for ensuring risk management and overseeing the framework by which data health can be monitored. You will monitor data quality metrics and hold stewards accountable so that their products meet established standards for accuracy, completeness, and consistency (see the sketch after this listing).

Data Product Lifecycle Management: You will oversee the full delivery lifecycle of data products, from ideation to release. This includes working with cross-functional teams—such as product managers, engineers, and business stakeholders—to bake data governance controls into the planning and design of products.

Stakeholder Management: You will work with both internal and external stakeholders to ensure that data products align with organizational goals, address customer needs, and meet policy standards. Regular engagement with stakeholders will be key to soliciting feedback on data products and identifying opportunities for enhancement.

Collaboration & Communication: You will communicate effectively with both technical and non-technical teams, ensuring that complex data concepts are conveyed clearly. Your collaboration with internal and external teams will ensure that data solutions align with business goals and industry best practices. You will be expected to work in an agile environment, managing multiple priorities to ensure efficient and timely data product delivery.

Qualifications & Requirements
The ideal candidate will possess the following qualifications:

Experience: At least 4 years of experience in data governance or a related field. Experience in the financial services industry is a plus, but not required. A strong background in data quality management is essential.

Technical Skills: Proficiency in data management tools and technologies such as SQL, Python, Jupyter Notebook, Unix, Hadoop, Spark SQL, Tableau, etc. Metadata management / data modeling (CDM/LDM/PDM) and experience with data governance tools for data management and quality assurance is preferred. Knowledge of databases (relational, NoSQL, graph) and cloud-based data platforms (e.g., Snowflake) is also beneficial.

Business & Communication Skills: Strong business acumen and the ability to align data products with both organizational and client needs. You should be able to effectively communicate complex technical concepts to both technical and non-technical stakeholders. Strong organizational skills and the ability to manage multiple tasks and priorities in an agile environment are essential.

Education: A bachelor’s degree, preferably in Business, Computer Science, Data Science, or a related field.

Our Benefits
To help you stay energized, engaged and inspired, we offer a wide range of benefits including a strong retirement plan, tuition reimbursement, comprehensive healthcare, support for working parents and Flexible Time Off (FTO) so you can relax, recharge and be there for the people you care about.

Our hybrid work model
BlackRock’s hybrid work model is designed to enable a culture of collaboration and apprenticeship that enriches the experience of our employees, while supporting flexibility for all. Employees are currently required to work at least 4 days in the office per week, with the flexibility to work from home 1 day a week. Some business groups may require more time in the office due to their roles and responsibilities. We remain focused on increasing the impactful moments that arise when we work together in person – aligned with our commitment to performance and innovation. As a new joiner, you can count on this hybrid model to accelerate your learning and onboarding experience here at BlackRock.

About BlackRock
At BlackRock, we are all connected by one mission: to help more and more people experience financial well-being. Our clients, and the people they serve, are saving for retirement, paying for their children’s educations, buying homes and starting businesses. Their investments also help to strengthen the global economy: support businesses small and large; finance infrastructure projects that connect and power cities; and facilitate innovations that drive progress. This mission would not be possible without our smartest investment – the one we make in our employees. It’s why we’re dedicated to creating an environment where our colleagues feel welcomed, valued and supported with networks, benefits and development opportunities to help them thrive.

For additional information on BlackRock, please visit @blackrock | Twitter: @blackrock | LinkedIn: www.linkedin.com/company/blackrock

BlackRock is proud to be an Equal Opportunity Employer. We evaluate qualified applicants without regard to age, disability, family status, gender identity, race, religion, sex, sexual orientation and other protected attributes at law.
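To make the data-quality monitoring idea concrete, a small hedged sketch of completeness and uniqueness checks in pandas; the dataset, columns, and the quality gate are invented for illustration and do not reflect BlackRock's actual tooling:

```python
import pandas as pd

# Hypothetical data product extract.
df = pd.DataFrame({
    "trade_id": [101, 102, 102, 104],
    "notional": [5e6, None, 2e6, 7e6],
    "currency": ["USD", "EUR", "EUR", None],
})

# Completeness: share of non-null values per column.
completeness = (1 - df.isna().mean()).round(2)

# Uniqueness: duplicate keys violate the product's contract.
duplicate_keys = int(df.duplicated(subset=["trade_id"]).sum())

print("completeness by column:\n", completeness)
print("duplicate trade_ids:", duplicate_keys)
if duplicate_keys > 0:
    print("data quality gate failed: duplicate keys present")  # example gate
```

In a governance framework these checks would run on a schedule, feed dashboards, and trigger steward follow-up rather than a simple print.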
Posted 4 days ago
5.0 years
0 Lacs
Gurugram, Haryana, India
Remote
🚀 We're Hiring: Senior ETL Tester (QA) – 5+ Years Experience
📍 Location: [GURGAON / Remote / Hybrid]
🕒 Employment Type: Full-Time
💼 Experience: 5+ Years
💰 Salary: [Based on Experience]
📅 Joining: Immediate

---

🔍 About the Role:
We’re looking for a Senior ETL Tester (QA) with 5+ years of strong experience in testing data integration workflows, validating data pipelines, and ensuring data integrity across complex systems. You will play a critical role in guaranteeing that data transformation processes meet performance, accuracy, and compliance requirements.

---

✅ Key Responsibilities:
Design, develop, and execute ETL test plans, test scenarios, and SQL queries to validate data quality.
Perform source-to-target data validation, transformation logic testing, and reconciliation (see the sketch after this listing).
Collaborate with data engineers, business analysts, and developers to review requirements and ensure complete test coverage.
Identify, document, and manage defects using tools like JIRA, Azure DevOps, or similar.
Ensure data quality, completeness, and consistency across large-scale data platforms.
Participate in performance testing and optimize data testing frameworks.
Maintain and enhance automation scripts for recurring ETL validations (if applicable).

---

💡 Required Skills:
5+ years of hands-on experience in ETL testing and data validation.
Strong SQL skills for writing complex queries, joins, aggregations, and data comparisons.
Experience working with ETL tools (e.g., Informatica, Talend, DataStage, SSIS).
Knowledge of Data Warehousing concepts and Data Modeling.
Familiarity with data visualization/reporting tools (e.g., Tableau, Power BI – optional).
Experience with Agile/Scrum methodologies.
Strong analytical and problem-solving skills.

---

⭐ Nice to Have:
Exposure to big data platforms (e.g., Hadoop, Spark).
Experience with test automation tools for ETL processes.
Cloud data testing experience (AWS, Azure, or GCP).
Basic scripting (Python, Shell) for test automation.

---

🙌 Why Join Us?
Work with a fast-paced, dynamic team that values innovation and data excellence.
Competitive salary, flexible work hours, and growth opportunities.
Engage in large-scale, cutting-edge data projects.

---

📩 To Apply: Send your resume to ABHISHEK.RAJ@APPZLOGIC.COM
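A minimal sketch of the source-to-target reconciliation described above, using SQLite so it runs anywhere; in practice the same queries would run against the real source system and warehouse, and the tables here are hypothetical:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE src_orders (id INTEGER, amount REAL);
    CREATE TABLE tgt_orders (id INTEGER, amount REAL);
    INSERT INTO src_orders VALUES (1, 100.0), (2, 250.5), (3, 75.0);
    INSERT INTO tgt_orders VALUES (1, 100.0), (2, 250.5), (3, 75.0);
""")

# Reconcile row counts and a column aggregate between source and target.
src_count, src_sum = con.execute("SELECT COUNT(*), SUM(amount) FROM src_orders").fetchone()
tgt_count, tgt_sum = con.execute("SELECT COUNT(*), SUM(amount) FROM tgt_orders").fetchone()

assert src_count == tgt_count, f"row count mismatch: {src_count} vs {tgt_count}"
assert abs(src_sum - tgt_sum) < 1e-6, "amount totals diverge between source and target"
print("reconciliation passed:", src_count, "rows,", src_sum, "total amount")
```

Real test suites usually extend this with per-column checksums, null-rate checks, and row-level diffs on the transformation logic.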
Posted 4 days ago
8.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
🚀 We're Hiring – Data Engineers | Hybrid Roles 🚀
Join our team of passionate technologists working on cutting-edge data platforms!

🔹 Position 1: Data Engineer
📍 Locations: Noida, Gurgaon, Jaipur, Indore, Hyderabad (Hybrid)
🧠 Experience: 8+ Years
🕒 Shift: 11:00 AM – 8:30 PM
💼 Skills: Azure Architecture, Azure Functions, App Services, Python, CI/CD

🔹 Position 2: Data Engineer / Full Stack Engineer
📍 Locations: Noida, Gurgaon, Hyderabad, Bangalore (Hybrid)
🧠 Experience: 8+ Years
🕒 Shift: 11:00 AM – 8:30 PM
💼 Skills: PySpark, Databricks, ADF, Big Data, Hadoop, Hive

If you’re ready to take the next step in your data engineering career, we’d love to connect!
📩 Share your resume at kumar.unnati@cloudstakes.com or DM me directly.
Posted 4 days ago
2.0 years
0 Lacs
Coimbatore, Tamil Nadu, India
On-site
Education: Bachelor's degree in Computer Science, Data Science, Artificial Intelligence, or a related field, or related certifications/experience in NLP & CV.
Years of Experience: Minimum 2 years of experience in Deep Learning, NLP, CV, MLOps, and related technologies.

Responsibilities:
- Design, develop, and deploy state-of-the-art NLP and CV models and algorithms.
- Collaborate with cross-functional teams to understand requirements and develop customised NLP & CV solutions; experience in building Python backends using Flask / Django (see the sketch after this listing).
- Database integration, preferably using MongoDB or other vector databases.
- Maintain and improve the performance, accuracy, and efficiency of existing AI/ML models and their deployment on cloud platforms (AWS), and monitor their performance using MLOps tools such as MLflow and DVC.
- Experience in building end-to-end data pipelines.
- Stay updated with emerging AI/ML technologies, LLMs, and RAG.
- Conduct regular performance evaluations of AI/ML models using production-grade MLOps solutions.
- Troubleshoot and resolve any issues arising from the implementation of NLP & CV models.
- Develop and monitor the code that runs in production environments using MLOps practices.

Requirements:
- Strong experience with deep learning and NLP frameworks such as TensorFlow or other open-source machine learning frameworks.
- Experience using both the TensorFlow and PyTorch frameworks.
- Proficiency in programming languages such as Python or Java, and experience with AI/ML libraries.
- Familiarity with the integration of APIs, such as REST APIs and the OpenAI API, for implementing advanced AI-driven features.
- Solid understanding of machine learning and deep learning algorithms, concepts, and best practices in a production environment using MLOps.
- Experience with big data technologies, such as Hadoop and Spark, is a plus.
- Strong problem-solving skills.
- Excellent communication and teamwork skills, with the ability to collaborate effectively with team members from various disciplines.
- Eagerness to learn and adapt to new technologies and industry trends.
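Because the listing calls for a Python backend built with Flask, here is a bare-bones inference endpoint sketch; the keyword-based "model" is a placeholder where a real NLP model would be loaded:

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/predict", methods=["POST"])
def predict():
    payload = request.get_json(force=True)
    text = payload.get("text", "")
    # Placeholder scoring logic; a real service would call a loaded NLP/CV model here.
    label = "positive" if "good" in text.lower() else "negative"
    return jsonify({"text": text, "label": label})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```

A client could then POST a JSON body like {"text": "good product"} to /predict and receive the label back as JSON.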
Posted 4 days ago
3.0 - 8.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Summary

Job Description: AWS Data Engineer

About the Role
We are looking for a highly technical and experienced AWS Data Engineer to join our team. The successful candidate will be responsible for designing, developing, and deploying machine learning models to solve complex business problems by leveraging large datasets on the AWS platform. This role requires working across the entire ML lifecycle, from data collection and preprocessing to model training, evaluation, and deployment using AWS AI Services. The goal is to create efficient self-learning applications capable of evolving over time. If you are passionate about data engineering and machine learning, possess strong programming skills, and have a deep understanding of statistical methods and various ML algorithms, we want to hear from you.

Responsibilities
Design, develop, and deploy machine learning models on AWS to address complex business challenges.
Work across the ML lifecycle, including data collection, preprocessing, model training, evaluation, and deployment, using services such as Amazon SageMaker, AWS Glue, and Amazon S3.
Leverage large datasets to derive insights and create data-driven solutions using AWS analytics tools.
Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions on AWS.
Optimize and maintain data pipelines and systems on AWS to ensure efficient data processing and storage.
Implement and monitor model performance, making necessary adjustments to improve accuracy and efficiency using AWS monitoring tools (see the sketch after this listing).
Keep up to date with the latest advancements in AWS AI and machine learning technologies.
Document processes and models to ensure transparency and reproducibility.

Preferred Qualifications
Bachelor's or Master's degree in Computer Science, Data Science, Statistics, or a related field.
Proven experience as a Data Engineer or in a similar role, with a strong focus on machine learning and AWS, with 3 to 8 years of relevant experience.
Proficiency in programming languages such as Python, with experience using AWS SDKs and APIs.
Deep understanding of statistical methods and various machine learning algorithms.
Experience with AWS AI and ML frameworks and libraries, such as Amazon SageMaker, AWS Glue, and AWS Lambda.
Strong analytical and problem-solving skills.
Excellent communication and collaboration abilities.
Knowledge of big data technologies and tools, such as Hadoop, Spark, or Kafka, is a plus.
Familiarity with the AWS cloud platform and services like Amazon EC2, Amazon RDS, and Amazon Redshift is an advantage.
Ability to work independently and manage multiple projects simultaneously.
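For the model-monitoring bullet above, a hedged boto3 sketch publishing a custom model metric to Amazon CloudWatch (one common choice of AWS monitoring tool); the namespace, metric name, and value are invented, and credentials/region are assumed to be configured in the environment:

```python
import boto3

cloudwatch = boto3.client("cloudwatch")

# Publish a custom metric so dashboards and alarms can track model quality over time.
cloudwatch.put_metric_data(
    Namespace="Custom/MLModels",          # hypothetical namespace
    MetricData=[{
        "MetricName": "OfflineAccuracy",  # hypothetical metric name
        "Value": 0.93,                    # e.g., accuracy from the latest evaluation run
        "Unit": "None",
    }],
)
print("metric published")
```

A CloudWatch alarm on this metric could then page the team or trigger retraining when model quality drops below a threshold.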
Posted 4 days ago
5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Teamwork makes the stream work.

Roku is changing how the world watches TV
Roku is the #1 TV streaming platform in the U.S., Canada, and Mexico, and we've set our sights on powering every television in the world. Roku pioneered streaming to the TV. Our mission is to be the TV streaming platform that connects the entire TV ecosystem. We connect consumers to the content they love, enable content publishers to build and monetize large audiences, and provide advertisers unique capabilities to engage consumers. From your first day at Roku, you'll make a valuable - and valued - contribution. We're a fast-growing public company where no one is a bystander. We offer you the opportunity to delight millions of TV streamers around the world while gaining meaningful experience across a variety of disciplines.

About the Team:
The Data Foundations team plays a critical role in supporting Roku Ads business intelligence and analytics. The team is responsible for developing and managing foundational datasets designed to serve the operational and analytical needs of the broader organization. The team's mission is carried out through three focus areas: acting as the interface between data producers and consumers, simplifying data architecture, and creating tools in a standardized way.

About the Role:
We are seeking a talented and experienced Senior Software Engineer with a strong background in big data technologies, including Apache Spark and Apache Airflow. This hybrid role bridges software and data engineering, requiring expertise in designing, building, and maintaining scalable systems for both application development and data processing. You will collaborate with cross-functional teams to design and manage robust, production-grade, large-scale data systems. The ideal candidate is a proactive self-starter with a deep understanding of high-scale data services and a commitment to excellence.

What you’ll be doing
Software Development: Write clean, maintainable, and efficient code, ensuring adherence to best practices through code reviews.
Big Data Engineering: Design, develop, and maintain data pipelines and ETL workflows using Apache Spark and Apache Airflow (see the sketch after this listing). Optimize data storage, retrieval, and processing systems to ensure reliability, scalability, and performance. Develop and fine-tune complex queries and data processing jobs for large-scale datasets. Monitor, troubleshoot, and improve data systems for minimal downtime and maximum efficiency.
Collaboration & Mentorship: Partner with data scientists, software engineers, and other teams to deliver integrated, high-quality solutions. Provide technical guidance and mentorship to junior engineers, promoting best practices in data engineering.

We’re excited if you have
Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent experience).
5+ years of experience in software and/or data engineering with expertise in big data technologies such as Apache Spark, Apache Airflow, and Trino.
Strong understanding of SOLID principles and distributed systems architecture.
Proven experience in distributed data processing, data warehousing, and real-time data pipelines.
Advanced SQL skills, with expertise in query optimization for large datasets.
Exceptional problem-solving abilities and the capacity to work independently or collaboratively.
Excellent verbal and written communication skills.
Experience with cloud platforms such as AWS, GCP, or Azure, and containerization tools like Docker and Kubernetes.
Preferred:
Familiarity with additional big data technologies, including Hadoop, Kafka, and Presto.
Strong programming skills in Python, Java, or Scala.
Knowledge of CI/CD pipelines, DevOps practices, and infrastructure-as-code tools (e.g., Terraform).
Expertise in data modeling, schema design, and data visualization tools.

Benefits
Roku is committed to offering a diverse range of benefits as part of our compensation package to support our employees and their families. Our comprehensive benefits include global access to mental health and financial wellness support and resources. Local benefits include statutory and voluntary benefits which may include healthcare (medical, dental, and vision), life, accident, disability, commuter, and retirement options (401(k)/pension). Our employees can take time off work for vacation and other personal reasons to balance their evolving work and life needs. It's important to note that not every benefit is available in all locations or for every role. For details specific to your location, please consult with your recruiter.

The Roku Culture
Roku is a great place for people who want to work in a fast-paced environment where everyone is focused on the company's success rather than their own. We try to surround ourselves with people who are great at their jobs, who are easy to work with, and who keep their egos in check. We appreciate a sense of humor. We believe a fewer number of very talented folks can do more for less cost than a larger number of less talented teams. We're independent thinkers with big ideas who act boldly, move fast and accomplish extraordinary things through collaboration and trust. In short, at Roku you'll be part of a company that's changing how the world watches TV.

We have a unique culture that we are proud of. We think of ourselves primarily as problem-solvers, which itself is a two-part idea. We come up with the solution, but the solution isn't real until it is built and delivered to the customer. That penchant for action gives us a pragmatic approach to innovation, one that has served us well since 2002. To learn more about Roku, our global footprint, and how we've grown, visit https://www.weareroku.com/factsheet.

By providing your information, you acknowledge that you have read our Applicant Privacy Notice and authorize Roku to process your data subject to those terms.
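To illustrate the Spark/Airflow pipeline work referenced in this posting, a minimal Airflow DAG sketch (assuming Airflow 2.4+ for the `schedule` argument); the task bodies and DAG id are placeholders, not Roku's actual pipelines:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull raw events from the source system")  # placeholder task body

def transform():
    print("run the Spark aggregation job")  # placeholder; often a Spark submit step

with DAG(
    dag_id="daily_ads_rollup",        # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    extract_task >> transform_task  # extract runs before transform each day
```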
Posted 4 days ago
12.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Looking for a Manager, Data Engineering:
12+ years of total experience in data engineering and analytics.
2+ years of experience with GCP cloud services such as BigQuery, Airflow DAGs, Dataflow, etc.
2+ years of experience in data extraction and creating data pipeline workflows on big data (Hive, HQL/PySpark), with knowledge of data engineering concepts.
Exposure to analyzing large datasets from multiple data sources and performing data validation.
Knowledge of Hadoop ecosystem components such as HDFS, Spark, Hive, and Sqoop.
Experience writing code in Python.
Knowledge of SQL/HQL functionality to write optimized queries.
Ability to build a migration plan in collaboration with stakeholders.
Analytical and problem-solving skills.
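As a concrete illustration of the BigQuery side of this stack (not taken from the posting itself), here is a minimal sketch of a parameterized extraction using the google-cloud-bigquery client; the project, dataset, table, and column names are hypothetical.

```python
# Minimal sketch: extract rows from BigQuery for downstream pipeline work.
# Assumes google-cloud-bigquery is installed and application-default
# credentials are configured; all names here are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

query = """
    SELECT user_id, event_ts, event_type
    FROM `example-project.analytics.events`
    WHERE DATE(event_ts) = @run_date
"""
job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter("run_date", "DATE", "2024-01-01"),
    ]
)

# Run the query and iterate over the result rows.
for row in client.query(query, job_config=job_config).result():
    print(row.user_id, row.event_type)
```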
Posted 4 days ago
3.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description

About Oracle Analytics & Big Data Service: Oracle Analytics is a complete platform that supports every role within analytics, offering cloud-native services or on-premises solutions without compromising security or governance. Our platform delivers a unified system for managing everything from data collection to decision-making, with seamless integration of AI and machine learning to help businesses accelerate productivity and uncover critical insights.

Oracle Big Data Service, a part of Oracle Analytics, is a fully managed, automated cloud service designed to help enterprises create scalable Hadoop-based data lakes. The service's scope of work encompasses not just good integration with OCI's native infrastructure (security, cloud, storage, etc.) but also deep integration with other relevant cloud-native services in OCI. It includes cloud-native approaches to service-level patching and upgrades, and maintaining high availability of the service in the face of random failures and planned downtime in the underlying infrastructure (e.g., patching the Linux kernel to address a security vulnerability). Developing systems that monitor the service, gather telemetry on its runtime characteristics, and act on that telemetry data is also part of the charter (a small illustrative sketch follows this posting).

We are interested in experienced engineers with expertise in, and a passion for, solving difficult problems in distributed systems and highly available services, to join our Oracle Big Data Service team. In this role, you will be instrumental in building, maintaining, and enhancing our managed, cloud-native Big Data service focused on large-scale data processing and analytics. At Oracle, you can help shape, design, and build innovative new systems from the ground up. These are exciting times in our space - we are growing fast, still at an early stage, and working on ambitious new initiatives. Engineers at any level can have significant technical and business impact.

Minimum Qualifications:
Bachelor's or Master's degree in Computer Science, Engineering, or a related technical field.
Minimum of 3-5 years of experience in software development, with a focus on large-scale distributed systems, cloud services, or Big Data technologies.
US passport holder; this is required by the position to access US Gov regions.
Expertise in coding in Java and Python, with an emphasis on tuning/optimization.
Experience with Linux systems administration, troubleshooting, and security best practices in cloud environments.
Familiarity with containerization and orchestration technologies like Docker and Kubernetes.
Experience with automation tools and techniques for deployment, patching, and service monitoring.
Experience with open-source software in the Big Data ecosystem.
Experience at an organization with an operational/DevOps culture.
Solid understanding of networking, storage, and security components related to cloud infrastructure.
Solid foundation in data structures, algorithms, and software design, with strong analytical and debugging skills.

Preferred Qualifications:
Proven expertise in cloud-native architectures and services, preferably within Oracle Cloud Infrastructure (OCI), AWS, Azure, or GCP.
Deep understanding of Java and JVM mechanics.
Hands-on experience with the Hadoop ecosystem (HDFS, MapReduce, YARN), Spark, Kafka, Flink, and other big data technologies is a plus.
Excellent problem-solving skills and the ability to work in a fast-paced, agile environment.
Responsibilities

Key Responsibilities:
Participate in the design, development, and maintenance of a scalable and secure Hadoop-based data lake service.
Code, integrate, and operationalize open- and closed-source data ecosystem components for Oracle cloud service offerings.
Collaborate with cross-functional teams including DevOps, Security, and Product Management to define and execute product roadmaps, service updates, and feature enhancements.
Become an active member of the Apache open source community when working on open-source components.
Implement high-availability strategies and design patterns to manage system failures and ensure uninterrupted service during planned or unplanned infrastructure downtime.
Ensure compliance with security protocols and industry best practices when handling large-scale data processing in the cloud.

Qualifications: Career Level - IC3

About Us

As a world leader in cloud solutions, Oracle uses tomorrow's technology to tackle today's challenges. We've partnered with industry leaders in almost every sector, and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That's why we're committed to growing an inclusive workforce that promotes opportunities for all.

Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs.

We're committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States.

Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans' status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
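The monitoring-and-telemetry charter described above could look, in miniature, like the following purely illustrative sketch (not Oracle's actual tooling): a probe that polls a health endpoint and invokes a remediation hook after repeated failures. The endpoint, service name, and thresholds are hypothetical.

```python
# Illustrative health probe: poll a service endpoint, record failures,
# and invoke a remediation hook after repeated misses.
# The URL, service unit, and thresholds are hypothetical placeholders.
import json
import subprocess
import time
import urllib.request

HEALTH_URL = "http://localhost:8080/health"  # hypothetical endpoint
MAX_FAILURES = 3

def probe() -> bool:
    """Return True if the service reports itself healthy."""
    try:
        with urllib.request.urlopen(HEALTH_URL, timeout=5) as resp:
            return resp.status == 200 and json.load(resp).get("status") == "UP"
    except (OSError, ValueError):  # connection errors, timeouts, bad JSON
        return False

failures = 0
while True:
    if probe():
        failures = 0
    else:
        failures += 1
        print(f"health check failed ({failures}/{MAX_FAILURES})")
        if failures >= MAX_FAILURES:
            # Hypothetical remediation: restart the service unit.
            subprocess.run(["systemctl", "restart", "bigdata-service"])
            failures = 0
    time.sleep(30)
```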
Posted 4 days ago
1.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description

About Oracle Analytics & Big Data Service: Oracle Analytics is a complete platform that supports every role within analytics, offering cloud-native services or on-premises solutions without compromising security or governance. Our platform delivers a unified system for managing everything from data collection to decision-making, with seamless integration of AI and machine learning to help businesses accelerate productivity and uncover critical insights.

Oracle Big Data Service, a part of Oracle Analytics, is a fully managed, automated cloud service designed to help enterprises create scalable Hadoop-based data lakes. The service's scope of work encompasses not just good integration with OCI's native infrastructure (security, cloud, storage, etc.) but also deep integration with other relevant cloud-native services in OCI. It includes cloud-native approaches to service-level patching and upgrades, and maintaining high availability of the service in the face of random failures and planned downtime in the underlying infrastructure (e.g., patching the Linux kernel to address a security vulnerability). Developing systems that monitor the service, gather telemetry on its runtime characteristics, and act on that telemetry data is also part of the charter.

We are interested in experienced engineers with expertise in, and a passion for, solving difficult problems in distributed systems and highly available services, to join our Oracle Big Data Service team. In this role, you will be instrumental in building, maintaining, and enhancing our managed, cloud-native Big Data service focused on large-scale data processing and analytics. At Oracle, you can help shape, design, and build innovative new systems from the ground up. These are exciting times in our space - we are growing fast, still at an early stage, and working on ambitious new initiatives. Engineers at any level can have significant technical and business impact.

Minimum Qualifications:
Bachelor's or Master's degree in Computer Science, Engineering, or a related technical field.
Minimum of 1-2 years of experience in software development, with a focus on large-scale distributed systems, cloud services, or Big Data technologies.
US passport holder; this is required by the position to access US Gov regions.
Expertise in coding in Java and Python, with an emphasis on tuning/optimization.
Experience with Linux systems administration, troubleshooting, and security best practices in cloud environments.
Experience with open-source software in the Big Data ecosystem.
Experience at an organization with an operational/DevOps culture.
Solid understanding of networking, storage, and security components related to cloud infrastructure.
Solid foundation in data structures, algorithms, and software design, with strong analytical and debugging skills.

Preferred Qualifications:
Hands-on experience with the Hadoop ecosystem (HDFS, MapReduce, YARN), Spark, Kafka, Flink, and other big data technologies.
Proven expertise in cloud-native architectures and services, preferably within Oracle Cloud Infrastructure (OCI), AWS, Azure, or GCP.
In-depth understanding of Java and JVM mechanics.
Good problem-solving skills and the ability to work in a fast-paced, agile environment.

Responsibilities

Key Responsibilities:
Participate in the development and maintenance of a scalable and secure Hadoop-based data lake service.
Code, integrate, and operationalize open- and closed-source data ecosystem components for Oracle cloud service offerings.
Collaborate with cross-functional teams including DevOps, Security, and Product Management to define and execute product roadmaps, service updates, and feature enhancements.
Become an active member of the Apache open source community when working on open-source components.
Ensure compliance with security protocols and industry best practices when handling large-scale data processing in the cloud.

Qualifications: Career Level - IC2

About Us

As a world leader in cloud solutions, Oracle uses tomorrow's technology to tackle today's challenges. We've partnered with industry leaders in almost every sector, and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That's why we're committed to growing an inclusive workforce that promotes opportunities for all.

Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs.

We're committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States.

Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans' status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
Posted 4 days ago
5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Description: Business Analyst

Key Responsibilities

Business
Deeply understand business goals and strategies, as well as operational context and processes.
Gather, analyze, document, and validate the needs of business stakeholders. Document business user requirements.
Write user stories, manage scrum calls, and prioritize the backlog.
Provide regular, timely, transparent communication on progress to capabilities leadership teams.
Ensure business solutions are deployed, fully embedded into ways of working, and constantly improved to address changing business needs and to transform and mature our business.
Work closely with data asset and employee experience teams on report builds.

Technical
Identify requirements (if not already provided), business functionality, and KPIs; identify source and target tables.
Perform a high-level analysis of the reports to be built; maintain design documentation and user stories.
Responsible for identifying and curating L2 data assets for OneS4 reporting:
Check whether these entities are available in Digital Core.
Sometimes the entities might be available, but the source might not be curated.
Sometimes the entities might be available, but lacking the necessary columns.
At times the entity might have to be curated as a new data asset.
Using the above high-level assessment, provide documentation of the curation requirements, possibly with support from Business Analysts/Developers [code analysis].
Work with data asset teams to model the data requirements into existing/new data assets.
Work with Business Owners for UAT.
Perform ad hoc analysis for capabilities as needed.

Qualifications

Business Skills
Minimum of a Bachelor's degree is required. Specialization in supply chain, with experience in data engineering, data science, or advanced analytics, is strongly preferred.
Minimum of 5 years of business experience, with at least 2 years of Supply Chain, IT, or other relevant functional experience, is required.
Outstanding stakeholder management experience and superior influencing and collaboration skills are required.
Strong written and verbal communication skills are required.
Strong understanding of, and working experience with, design-thinking best practices (empathy, requirements definition, ideation, prototyping, and testing) to deliver digital products and experiences is required.
Ability to work with both technical and non-technical team members is required.
Strong critical thinking, with outstanding analytical, influencing, and interpersonal skills, is required.
Experience working in or with a complex global organization spanning functions is preferred.

Project Management
Experience with Agile product development methodology is required, including maintaining Jira boards and epics.
Experience gathering, analyzing, documenting, and validating the needs of business stakeholders is required.
Experience writing user stories, managing scrum calls, prioritizing the backlog, etc. is required.

Technical Skills
Experience working with Tableau, Alteryx, SQL, or other data and analytics tools is preferred.
Experience in SQL [legacy Hadoop analysis] and, predominantly, Databricks/Synapse and the Azure stack [at minimum, the ability to set up and do basic querying]; a minimal querying sketch follows this posting.
Strong digital product management experience, including experience building/managing data and analytics platforms and data science / advanced analytics / intelligent automation solutions, is preferred.
Scrum Alliance certification (e.g., Product Owner, Developer, Scrum Master), project planning, and JIRA experience are preferred.

Job Type: Payroll

Must Have Skills
SQL - 3 Years - Intermediate
Business Analytics - 6 Years
Analytical Skills - 3 Years - Intermediate
Power BI - 3 Years - Intermediate
Supply Chain Management - 3 Years - Intermediate

(ref:hirist.tech)
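As referenced in the Technical Skills above, here is a minimal sketch of basic querying and a column-availability check in a Databricks/Synapse-style PySpark environment; the database, table, and column names are hypothetical.

```python
# Minimal PySpark SQL sketch: check whether an L2 entity exposes the
# columns a report needs. Table and column names are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("l2-curation-check").getOrCreate()

required = {"order_id", "plant_code", "ship_date", "net_value"}
available = set(spark.table("digital_core.sales_orders").columns)

missing = required - available
if missing:
    print(f"Entity needs curation; missing columns: {sorted(missing)}")
else:
    # Basic querying: aggregate for a report build.
    spark.sql("""
        SELECT plant_code, SUM(net_value) AS total_value
        FROM digital_core.sales_orders
        GROUP BY plant_code
    """).show()
```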
Posted 4 days ago
3.0 - 5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
DevOps Engineer : Bangalore

Job Description: DevOps Engineer, Qilin Lab, Bangalore, India

Role: We are seeking an experienced DevOps Engineer to deliver insights from massive-scale data in real time. Specifically, we're searching for someone who has fresh ideas and a unique viewpoint, and who enjoys collaborating with a cross-functional team to develop real-world solutions and positive user experiences.

Responsibilities of this role:
Work with DevOps to run the production environment by monitoring availability and taking a holistic view of system health.
Build software and systems to manage our Data Platform infrastructure.
Improve reliability, quality, and time-to-market of our Global Data Platform.
Measure and optimize system performance and innovate for continual improvement.
Provide operational support and engineering for a distributed platform at scale.
Define, publish, and defend service-level objectives (SLOs); see the sketch after this posting.
Partner with data engineers to improve services through rigorous testing and release procedures.
Participate in system design, platform management, and capacity planning.
Create sustainable systems and services through automation and automated runbooks.
Take a proactive approach to identifying problems and seeking areas for improvement.
Mentor the team in infrastructure best practices.

Qualifications:
Bachelor's degree in Computer Science or an IT-related field, or equivalent practical experience with a proven track record.
The following hands-on working knowledge and experience is required:
Kubernetes, EC2, RDS, ELK Stack; cloud platforms (AWS, Azure, GCP), preferably AWS; building and operating clusters.
Related technologies such as containers, Helm, Kustomize, ArgoCD.
Ability to program (structured and OOP) using at least one high-level language such as Python, Java, Go, etc.
Agile methodologies (Scrum, TDD, BDD, etc.).
Continuous integration and continuous delivery tools (GitOps), Terraform, Unix/Linux environments.
Experience with several of the following tools/technologies is desirable:
Big data platforms (e.g., Apache Hadoop and Apache Spark); streaming technologies (Kafka, Kinesis, etc.); ElasticSearch; service mesh and orchestration technologies, e.g., Argo.
Knowledge of the following is a plus: security (OWASP, SIEM, etc.); infrastructure testing (chaos, load, security); GitHub; microservices architectures.

Notice period: Immediate to 15 days
Experience: 3 to 5 years
Job Type: Full-time
Schedule: Day shift, Monday to Friday
Work Location: On Site
Job Type: Payroll

Must Have Skills
Python - 3 Years - Intermediate
DevOps - 3 Years - Intermediate
AWS - 2 Years - Intermediate
Agile Methodology - 3 Years - Intermediate
Kubernetes - 3 Years - Intermediate
ElasticSearch - 3 Years - Intermediate

(ref:hirist.tech)
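As referenced in the responsibilities above, here is a minimal illustrative sketch of SLO bookkeeping: computing the remaining error budget from an availability objective and request counts. The target and counts are hypothetical placeholders for values a metrics system would supply.

```python
# Illustrative SLO error-budget check. The target and counts are
# hypothetical; real values would come from a metrics system.
SLO_TARGET = 0.999          # 99.9% availability objective
WINDOW_TOTAL = 10_000_000   # total requests in the rolling window
WINDOW_GOOD = 9_993_500     # successful requests in the window

availability = WINDOW_GOOD / WINDOW_TOTAL
error_budget = 1.0 - SLO_TARGET                  # allowed failure ratio
errors = 1.0 - availability                      # observed failure ratio
budget_remaining = 1.0 - errors / error_budget   # fraction of budget left

print(f"availability:     {availability:.4%}")
print(f"budget remaining: {budget_remaining:.1%}")
if budget_remaining < 0.25:
    print("Error budget nearly exhausted: freeze risky releases.")
```

With the sample numbers, availability is 99.935% against a 0.1% error budget, leaving 35% of the budget; the point of publishing such a number is that release and reliability decisions can be defended against it.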
Posted 4 days ago
3.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Description

About Oracle Analytics & Big Data Service: Oracle Analytics is a complete platform that supports every role within analytics, offering cloud-native services or on-premises solutions without compromising security or governance. Our platform delivers a unified system for managing everything from data collection to decision-making, with seamless integration of AI and machine learning to help businesses accelerate productivity and uncover critical insights.

Oracle Big Data Service, a part of Oracle Analytics, is a fully managed, automated cloud service designed to help enterprises create scalable Hadoop-based data lakes. The service's scope of work encompasses not just good integration with OCI's native infrastructure (security, cloud, storage, etc.) but also deep integration with other relevant cloud-native services in OCI. It includes cloud-native approaches to service-level patching and upgrades, and maintaining high availability of the service in the face of random failures and planned downtime in the underlying infrastructure (e.g., patching the Linux kernel to address a security vulnerability). Developing systems that monitor the service, gather telemetry on its runtime characteristics, and act on that telemetry data is also part of the charter.

We are interested in experienced engineers with expertise in, and a passion for, solving difficult problems in distributed systems and highly available services, to join our Oracle Big Data Service team. In this role, you will be instrumental in building, maintaining, and enhancing our managed, cloud-native Big Data service focused on large-scale data processing and analytics. At Oracle, you can help shape, design, and build innovative new systems from the ground up. These are exciting times in our space - we are growing fast, still at an early stage, and working on ambitious new initiatives. Engineers at any level can have significant technical and business impact.

Minimum Qualifications:
Bachelor's or Master's degree in Computer Science, Engineering, or a related technical field.
Minimum of 3-5 years of experience in software development, with a focus on large-scale distributed systems, cloud services, or Big Data technologies.
US passport holder; this is required by the position to access US Gov regions.
Expertise in coding in Java and Python, with an emphasis on tuning/optimization.
Experience with Linux systems administration, troubleshooting, and security best practices in cloud environments.
Familiarity with containerization and orchestration technologies like Docker and Kubernetes.
Experience with automation tools and techniques for deployment, patching, and service monitoring.
Experience with open-source software in the Big Data ecosystem.
Experience at an organization with an operational/DevOps culture.
Solid understanding of networking, storage, and security components related to cloud infrastructure.
Solid foundation in data structures, algorithms, and software design, with strong analytical and debugging skills.

Preferred Qualifications:
Proven expertise in cloud-native architectures and services, preferably within Oracle Cloud Infrastructure (OCI), AWS, Azure, or GCP.
Deep understanding of Java and JVM mechanics.
Hands-on experience with the Hadoop ecosystem (HDFS, MapReduce, YARN), Spark, Kafka, Flink, and other big data technologies is a plus.
Excellent problem-solving skills and the ability to work in a fast-paced, agile environment.
Responsibilities

Key Responsibilities:
Participate in the design, development, and maintenance of a scalable and secure Hadoop-based data lake service.
Code, integrate, and operationalize open- and closed-source data ecosystem components for Oracle cloud service offerings.
Collaborate with cross-functional teams including DevOps, Security, and Product Management to define and execute product roadmaps, service updates, and feature enhancements.
Become an active member of the Apache open source community when working on open-source components.
Implement high-availability strategies and design patterns to manage system failures and ensure uninterrupted service during planned or unplanned infrastructure downtime.
Ensure compliance with security protocols and industry best practices when handling large-scale data processing in the cloud.

Qualifications: Career Level - IC3

About Us

As a world leader in cloud solutions, Oracle uses tomorrow's technology to tackle today's challenges. We've partnered with industry leaders in almost every sector, and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That's why we're committed to growing an inclusive workforce that promotes opportunities for all.

Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs.

We're committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States.

Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans' status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
Posted 4 days ago
1.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Description

About Oracle Analytics & Big Data Service: Oracle Analytics is a complete platform that supports every role within analytics, offering cloud-native services or on-premises solutions without compromising security or governance. Our platform delivers a unified system for managing everything from data collection to decision-making, with seamless integration of AI and machine learning to help businesses accelerate productivity and uncover critical insights.

Oracle Big Data Service, a part of Oracle Analytics, is a fully managed, automated cloud service designed to help enterprises create scalable Hadoop-based data lakes. The service's scope of work encompasses not just good integration with OCI's native infrastructure (security, cloud, storage, etc.) but also deep integration with other relevant cloud-native services in OCI. It includes cloud-native approaches to service-level patching and upgrades, and maintaining high availability of the service in the face of random failures and planned downtime in the underlying infrastructure (e.g., patching the Linux kernel to address a security vulnerability). Developing systems that monitor the service, gather telemetry on its runtime characteristics, and act on that telemetry data is also part of the charter.

We are interested in experienced engineers with expertise in, and a passion for, solving difficult problems in distributed systems and highly available services, to join our Oracle Big Data Service team. In this role, you will be instrumental in building, maintaining, and enhancing our managed, cloud-native Big Data service focused on large-scale data processing and analytics. At Oracle, you can help shape, design, and build innovative new systems from the ground up. These are exciting times in our space - we are growing fast, still at an early stage, and working on ambitious new initiatives. Engineers at any level can have significant technical and business impact.

Minimum Qualifications:
Bachelor's or Master's degree in Computer Science, Engineering, or a related technical field.
Minimum of 1-2 years of experience in software development, with a focus on large-scale distributed systems, cloud services, or Big Data technologies.
US passport holder; this is required by the position to access US Gov regions.
Expertise in coding in Java and Python, with an emphasis on tuning/optimization.
Experience with Linux systems administration, troubleshooting, and security best practices in cloud environments.
Experience with open-source software in the Big Data ecosystem.
Experience at an organization with an operational/DevOps culture.
Solid understanding of networking, storage, and security components related to cloud infrastructure.
Solid foundation in data structures, algorithms, and software design, with strong analytical and debugging skills.

Preferred Qualifications:
Hands-on experience with the Hadoop ecosystem (HDFS, MapReduce, YARN), Spark, Kafka, Flink, and other big data technologies.
Proven expertise in cloud-native architectures and services, preferably within Oracle Cloud Infrastructure (OCI), AWS, Azure, or GCP.
In-depth understanding of Java and JVM mechanics.
Good problem-solving skills and the ability to work in a fast-paced, agile environment.

Responsibilities

Key Responsibilities:
Participate in the development and maintenance of a scalable and secure Hadoop-based data lake service.
Code, integrate, and operationalize open- and closed-source data ecosystem components for Oracle cloud service offerings.
Collaborate with cross-functional teams including DevOps, Security, and Product Management to define and execute product roadmaps, service updates, and feature enhancements.
Become an active member of the Apache open source community when working on open-source components.
Ensure compliance with security protocols and industry best practices when handling large-scale data processing in the cloud.

Qualifications: Career Level - IC2

About Us

As a world leader in cloud solutions, Oracle uses tomorrow's technology to tackle today's challenges. We've partnered with industry leaders in almost every sector, and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That's why we're committed to growing an inclusive workforce that promotes opportunities for all.

Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs.

We're committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States.

Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans' status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
Posted 4 days ago