3.0 - 6.0 years
11 - 20 Lacs
Bengaluru
Work from Office
Role & responsibilities: We are seeking a skilled Data Engineer to maintain robust data infrastructure and pipelines that support our operational analytics and business intelligence needs. The candidate will bridge the gap between data engineering and operations, ensuring reliable, scalable, and efficient data systems that enable data-driven decision making across the organization. Requirements include strong proficiency in Spark SQL; hands-on experience with real-time streaming using Kafka and Flink; strong knowledge of relational databases (Oracle, MySQL) and NoSQL systems; proficiency with version control (Git), CI/CD practices, and collaborative development workflows; strong operations management and stakeholder communication skills; flexibility to work across time zones with a cross-cultural communication mindset; experience working in cross-functional teams; and a continuous learning mindset with adaptability to new technologies. Preferred candidate profile: Bachelor's degree in Computer Science, Engineering, Mathematics, or a related field; 3+ years of experience in data engineering, software engineering, or a related role; proven experience building and maintaining production data pipelines; expertise in the Hadoop ecosystem (Spark SQL, Iceberg, Hive, etc.); extensive experience with Apache Kafka, Apache Flink, and other relevant streaming technologies; orchestration tools such as Apache Airflow and UC4; proficiency in Python, Unix shell scripting, or similar languages; good understanding of SQL across Oracle, SQL Server, NoSQL, or similar systems; and proficiency with Git, CI/CD practices, and collaborative development workflows. Immediate joiners, or candidates with a notice period of less than 30 days, are preferred.
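As context for the Spark SQL and orchestration skills listed above, here is a minimal PySpark sketch of the kind of scheduled batch aggregation such pipelines typically run. The database, table, and column names (orders_db.orders, order_ts, amount, analytics.daily_order_rollup) are illustrative assumptions, not details from the posting.

```python
# Minimal Spark SQL batch sketch: roll up the last 7 days of orders into a daily summary.
# Assumes a Hive metastore is configured and that the "analytics" database exists.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("daily-order-rollup")
    .enableHiveSupport()
    .getOrCreate()
)

daily_totals = spark.sql("""
    SELECT date(order_ts) AS order_date,
           count(*)        AS orders,
           sum(amount)     AS revenue
    FROM orders_db.orders
    WHERE order_ts >= date_sub(current_date(), 7)
    GROUP BY date(order_ts)
    ORDER BY order_date
""")

# Persist the result so downstream BI tools can query it.
daily_totals.write.mode("overwrite").saveAsTable("analytics.daily_order_rollup")
```

In practice a job like this would be wrapped in an Airflow (or UC4) task and scheduled daily, with the date window passed in as a parameter.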
Posted 16 hours ago
10.0 - 15.0 years
35 - 40 Lacs
Bengaluru
Work from Office
Description: We are looking for a highly skilled and experienced Over-the-Top (OTT) Subject Matter Expert (SME) to join our dynamic team. In this role, the OTT SME will provide strategic and technical leadership across all facets of our OTT video platform, ensuring the reliable and high-quality delivery of content to our audience. The ideal candidate will have deep expertise in OTT technologies, a strong grasp of industry trends, and a proven ability to apply best practices to drive platform performance and innovation. Requirements: Bachelor’s degree in Computer Science, Engineering, or a related field. 7+ years of hands-on experience in OTT video streaming, with in-depth knowledge of OTT technologies, platforms, and workflows. Proven experience across both frontend and backend OTT ecosystems. Strong understanding of video encoding, transcoding, packaging, and delivery formats (e.g., HLS, DASH, CMAF). Proficiency with OTT video players and SDKs (e.g., JW Player, THEOplayer, ExoPlayer). Experience with cloud-based video streaming services (e.g., AWS Media Services, Azure Media Services, Google Cloud Media CDN). Solid understanding of content delivery networks (CDNs) and streaming protocols. Experience with digital rights management (DRM) technologies such as Widevine, PlayReady, and FairPlay. Knowledge of video advertising integration, tracking, and monitoring. Hands-on experience developing video applications for mobile platforms, browsers, set-top boxes (STBs), and Smart TVs. Strong knowledge of OTT backend systems, including CMS, CDN, billing, ingestion, personalization, and user management. Familiarity with digital content rights, licensing, and restrictions management. Excellent analytical and problem-solving skills, with the ability to troubleshoot complex streaming and platform issues. Strong verbal and written communication skills, with the ability to work effectively across cross-functional teams. Self-motivated with the ability to manage multiple priorities and projects independently. Job Responsibilities: Act as the primary Subject Matter Expert (SME) or Architect for OTT technologies, platforms, and industry trends. Provide expert technical guidance and support to cross-functional teams, including engineering, product management, and operations. Design, implement, and optimize end-to-end OTT video workflows, encompassing encoding, transcoding, packaging, and content delivery. Troubleshoot and resolve complex technical issues related to OTT video streaming and platform performance. Evaluate emerging OTT technologies and make strategic recommendations to enhance platform scalability, reliability, and user experience. Develop and maintain comprehensive technical documentation, including architecture diagrams, specifications, and standard operating procedures. Monitor OTT platform performance, identify bottlenecks or inefficiencies, and drive continuous improvement initiatives. Stay current with evolving industry standards, protocols (e.g., HLS, DASH), and best practices in OTT streaming. Collaborate with third-party vendors and technology partners to integrate new services and innovations into the platform. Contribute to the development of product roadmaps and long-term strategic planning for OTT initiatives. What We Offer: Exciting Projects: We focus on industries like High-Tech, communication, media, healthcare, retail and telecom. Our customer list is full of fantastic global brands and leaders who love what we build for them. 
Collaborative Environment: You can expand your skills by collaborating with a diverse team of highly talented people in an open, laid-back environment — or even abroad in one of our global centers or client facilities! Work-Life Balance: GlobalLogic prioritizes work-life balance, which is why we offer flexible work schedules, opportunities to work from home, and paid time off and holidays. Professional Development: Our dedicated Learning & Development team regularly organizes communication skills training (GL Vantage, Toastmasters), stress management programs, professional certifications, and technical and soft skill trainings. Excellent Benefits: We provide our employees with competitive salaries, family medical insurance, Group Term Life Insurance, Group Personal Accident Insurance, NPS (National Pension Scheme), periodic health awareness programs, extended maternity leave, annual performance bonuses, and referral bonuses. Fun Perks: We want you to love where you work, which is why we host sports events and cultural activities, offer food at subsidized rates, and throw corporate parties. Our vibrant offices also include dedicated GL Zones, rooftop decks, and a GL Club where you can enjoy coffee or tea with your colleagues over a game of table tennis, as well as discounts at popular stores and restaurants!
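To make the HLS delivery requirements in the role above concrete, here is a minimal Python sketch that lists the variant streams (the bitrate ladder) declared in an HLS master playlist, using only the standard library. The playlist URL is a placeholder assumption.

```python
# Sketch: print the bitrate/resolution of each variant in an HLS master playlist.
import re
import urllib.request

MASTER_URL = "https://example.com/live/master.m3u8"  # placeholder URL, not a real stream

with urllib.request.urlopen(MASTER_URL) as resp:
    lines = resp.read().decode("utf-8").splitlines()

for i, line in enumerate(lines):
    if line.startswith("#EXT-X-STREAM-INF:"):
        bandwidth = re.search(r"BANDWIDTH=(\d+)", line)
        resolution = re.search(r"RESOLUTION=(\d+x\d+)", line)
        variant_uri = lines[i + 1] if i + 1 < len(lines) else "?"
        kbps = int(bandwidth.group(1)) // 1000 if bandwidth else "?"
        print(f"{kbps} kbps", resolution.group(1) if resolution else "?", "->", variant_uri)
```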
Posted 18 hours ago
0.0 - 2.0 years
2 - 4 Lacs
Chennai
Work from Office
Set up, test, and configure broadcast equipment such as switchers, routers, encoders, multiviewers, and IT systems. Signal management, maintenance & troubleshooting, and broadcast network & IT support. Manage media storage, NAS/SAN, FTP.
Posted 3 days ago
5.0 - 7.0 years
11 - 13 Lacs
Pune
Work from Office
Key Responsibilities: - Design and develop application interfaces, business logic, and data integrations within VBCS. - Implement responsive design principles to ensure applications are optimized for various devices and screen sizes. - Troubleshoot and resolve technical issues and bugs. - Monitor system performance and user feedback to propose necessary adjustments or enhancements. - Extend standard Oracle pages, design, and build bolt-on applications using VBCS. - Optimize VBCS extensions to work with large datasets for CRUD operations. - Use VBCS with OCI PaaS components like Oracle Integration Cloud, ATP/ADW, Streaming, and OCI functions to design scalable applications. - Interact with Oracle Fusion REST/SOAP services to create/update entries in Oracle Fusion through the VBCS user interface. - Migrate and deploy VBCS extensions across instances. Requirements: - Bachelor's degree in Computer Science, Engineering, or a related field. - Minimum of 5 years of Oracle VBCS design and development experience. - Strong understanding of Oracle cloud technologies with hands-on knowledge of VBCS. - Excellent analytical and problem-solving skills. - Strong knowledge of JavaScript, HTML, CSS, and RESTful APIs. - Familiarity with Agile development methodologies and version control systems (e.g., Git). - Strong communication and interpersonal skills for effective collaboration with cross-functional teams. Preferred Qualifications: - Knowledge of Oracle Cloud Infrastructure and other Oracle Cloud Services. - Previous experience in developing applications and extending SaaS applications for enterprise environments.
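As a rough illustration of the Fusion REST interaction pattern mentioned above (reading records that a VBCS page would then display or update), here is a hedged Python sketch. The host, resource path, response fields, and credentials are placeholder assumptions, not a confirmed Fusion configuration.

```python
# Sketch: call a Fusion-style REST resource and list a few records.
# All names below (host, resource version/path, fields, credentials) are placeholders.
import requests

BASE = "https://example-fusion.oraclecloud.com"  # placeholder host

resp = requests.get(
    f"{BASE}/fscmRestApi/resources/11.13.18.05/purchaseOrders",  # example resource path
    params={"limit": 5, "onlyData": "true"},
    auth=("integration.user", "********"),  # placeholder credentials
    timeout=30,
)
resp.raise_for_status()

for item in resp.json().get("items", []):
    print(item.get("OrderNumber"), item.get("Supplier"))  # assumed field names
```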
Posted 3 days ago
3.0 - 8.0 years
0 Lacs
chennai, tamil nadu
On-site
You should have 3-8 years of experience and possess strong knowledge of Java for Android and Objective-C for iOS. Exposure to multiple platforms would be preferred. Proficiency in SCM tools like SVN, Git, Maven, Jenkins, and familiarity with best practices and continuous integration is required. Additionally, experience with client testing frameworks such as Selenium and Selendroid is essential. It is important to have exposure to multimedia content delivery protocols like HLS, Smooth Streaming, and Download. Familiarity with media players on various mobile platforms and knowledge of performance improvement techniques for mobile devices are necessary. You should understand the difference between mobile product libraries, mobile applications, and client-specific extensions. As part of the role, you will be required to create test plans based on documentation or discussions with developers and architects. Implementing test suites, integrating them into existing frameworks using development best practices, and executing test suites will be part of your responsibilities. You should be able to provide an initial diagnosis of issue root cause and communicate test status updates to Project Management. Participation in Agile sprint planning sessions and providing constructive feedback is expected. You will also collaborate with the team in high-level estimation tasks. Being an easy-going and flexible team player with good communication skills, a creative mindset, and problem-solving abilities is essential for this role.
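As a small illustration of the client-testing work described above, here is a hedged Selenium sketch in Python. The page URL and element IDs are invented for the example and are not from the posting.

```python
# Sketch: a minimal browser test that starts playback on a hypothetical player page
# and asserts the reported status. Requires a local chromedriver/Chrome install.
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()
try:
    driver.get("https://example.com/player")              # placeholder page under test
    driver.find_element(By.ID, "play").click()             # assumed element id
    status = driver.find_element(By.ID, "status").text     # assumed element id
    assert "playing" in status.lower(), f"unexpected status: {status}"
finally:
    driver.quit()
```

A real suite would wrap checks like this in a test framework (e.g., pytest) and run against device farms or Selendroid for mobile targets.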
Posted 5 days ago
5.0 - 8.0 years
8 - 12 Lacs
Bengaluru
Work from Office
Strong experience working with the Apache Spark framework, including a solid grasp of core concepts, performance optimizations, and industry best practices Proficient in PySpark with hands-on coding experience; familiarity with unit testing, object-oriented programming (OOP) principles, and software design patterns Experience with code deployment and associated processes Proven ability to write complex SQL queries to extract business-critical insights Hands-on experience in streaming data processing Familiarity with machine learning concepts is an added advantage Experience with NoSQL databases Good understanding of Test-Driven Development (TDD) methodologies Demonstrated flexibility and eagerness to learn new technologies Roles and Responsibilities Design and implement solutions for problems arising out of large-scale data processing Attend/drive various architectural, design and status calls with multiple stakeholders Ensure end-to-end ownership of all assigned tasks, including development, testing, deployment and support Design, build & maintain efficient, reusable & reliable code Test implementation, troubleshoot & correct problems Capable of working both as an individual contributor and within a team Ensure high quality software development with complete documentation and traceability Fulfil organizational responsibilities (sharing knowledge & experience with other teams/groups)
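A brief sketch of the TDD-friendly PySpark style implied above: keeping transformations as pure functions over DataFrames so they can be unit-tested against a local SparkSession. The column names and the revenue calculation are illustrative assumptions.

```python
# Sketch: a testable PySpark transformation plus a pytest unit test using local mode.
import pytest
from pyspark.sql import SparkSession, DataFrame
from pyspark.sql import functions as F


def add_revenue(df: DataFrame) -> DataFrame:
    """Derive a revenue column as quantity * unit_price (assumed columns)."""
    return df.withColumn("revenue", F.col("quantity") * F.col("unit_price"))


@pytest.fixture(scope="module")
def spark():
    # Small local session, enough for unit tests.
    return SparkSession.builder.master("local[2]").appName("unit-tests").getOrCreate()


def test_add_revenue(spark):
    df = spark.createDataFrame([(2, 10.0), (3, 5.0)], ["quantity", "unit_price"])
    result = {row["revenue"] for row in add_revenue(df).collect()}
    assert result == {20.0, 15.0}
```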
Posted 5 days ago
0.0 - 3.0 years
1 - 3 Lacs
Chennai
Work from Office
Job description Experience: 1-3 years Job location: Chennai Roles & Responsibility: Hands-on experience with the audio/video ecosystem, particularly encoding and transcoding. Experience in configuration, troubleshooting, and maintenance of Live TV/VoD encoders. Strong understanding of networking concepts including LAN/WAN architecture, TCP/IP, UDP, DNS, and load balancing. Working knowledge of Linux environments (preferred). Familiarity with emerging technologies in the video domain. If interested, please share your updated resume to priyanka.d@sunnetwork.in
Posted 6 days ago
3.0 - 8.0 years
10 - 14 Lacs
Hyderabad
Work from Office
We are seeking a forward-thinking Software Engineer with expertise in Java Fullstack and JavaScript to join our dynamic team. The candidate will have the opportunity to develop cutting-edge applications, implement scalable microservices, and work on innovative technologies in cloud environments. Responsibilities Develop applications utilizing Java Implement microservice architecture patterns on Microsoft Azure Apply Apache Kafka for event streaming Leverage Docker and Kubernetes for containerization and microservice scaling Work with REST APIs and SQL databases Requirements 3 to 5 years of professional experience Strong technical proficiency in Java and microservices architecture Background in JavaScript and its frameworks Knowledge of Apache Kafka and container technologies like Docker and Kubernetes Expertise in Maven/Gradle and Microsoft Azure Proficiency in REST API, SQL, Spring Core, and data structure/algorithm problem-solving B2-level English communication skills
Posted 1 week ago
5.0 - 8.0 years
16 - 24 Lacs
Hyderabad
Work from Office
We are looking for an experienced Software Engineer to join our team and take ownership of designing and developing scalable, high-performance applications using Java Fullstack and JavaScript. This role will involve leveraging cutting-edge tools and technologies to build and maintain microservices on Microsoft Azure, ensuring seamless event streaming with Apache Kafka, and implementing containerisation with Docker and Kubernetes. Responsibilities Build robust applications using Java and JavaScript Design and implement microservice architecture patterns on Microsoft Azure Leverage Apache Kafka for efficient event streaming pipelines Deploy and manage microservices using Docker and Kubernetes for scalability Collaborate on REST API design and integration with SQL databases Optimize application performance by applying problem-solving skills Ensure code quality and maintainability through best practices in development Contribute to architectural decisions supporting long-term scalability and reliability Lead and mentor team members to enhance overall technical capabilities Requirements 5 to 9 years of experience in software engineering roles Strong proficiency in Java and Microservice architecture Expertise in JavaScript and related frameworks Knowledge of Apache Kafka, Docker, and Kubernetes for containerisation and event streaming Competency in Maven/Gradle, REST API, SQL databases, Spring Core, and Microsoft Azure Solid problem-solving skills and understanding of data structures and algorithms B2-level English proficiency with strong communication skills
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
maharashtra
On-site
As a member of the JM Financial team, you will be part of a culture that values recognition and rewards for the hard work and dedication of its employees. We believe that a motivated workforce is essential for the growth of our organization. Our management team acknowledges and appreciates the efforts of our personnel through promotions, bonuses, awards, and public recognition. By fostering an atmosphere of success, we celebrate achievements such as successful deals, good client ratings, and customer reviews. Nurturing talent is a key focus at JM Financial. We aim to prepare our employees for future leadership roles by creating succession plans and encouraging direct interactions with clients. Knowledge sharing and cross-functional interactions are integral to our business environment, fostering inclusivity and growth opportunities for our team members. Attracting and managing top talent is a priority for JM Financial. We have successfully built a diverse talent pool with expertise, new perspectives, and enthusiasm. Our strong brand presence in the market enables us to leverage the expertise of our business partners to attract the best talent. Trust is fundamental to our organization, binding our programs, people, and clients together. We prioritize transparency, two-way communication, and trust across all levels of the organization. Opportunities for growth and development are abundant at JM Financial. We believe in growing alongside our employees and providing them with opportunities to advance their careers. Our commitment to nurturing talent has led to the appointment of promising employees to leadership positions within the organization. With a focus on employee retention and a supportive environment for skill development, we aim to create a strong future leadership team. Emphasizing teamwork, we value both individual performance and collaborative group efforts. In a fast-paced corporate environment, teamwork is essential for achieving our common vision. By fostering open communication channels and facilitating information sharing, we ensure that every member of our team contributes to delivering value to our clients. As a Java Developer at JM Financial, your responsibilities will include designing, modeling, and building services to support new features and products. You will work on an integrated central platform to power various web applications, developing a robust backend framework and implementing features across different products using a combination of technologies. Researching and implementing new technologies to enhance our services will be a key part of your role. To excel in this position, you should have a BTech Degree in Computer Science or equivalent experience, with at least 3 years of experience building Java-based web applications in Linux/Unix environments. Proficiency in scripting languages such as JavaScript, Ruby, or Python, along with compiled languages like Java or C/C++, is required. Experience with Google Cloud Platform services, knowledge of design methodologies for backend services, and building scalable infrastructure are essential skills for this role. Our technology stack includes JavaScript, Angular, React, NextJS, HTML5/CSS3/Bootstrap, Windows/Linux/OSX Bash, Kookoo telephony, SMS Gupshup, Sendgrid, Optimizely, Mixpanel, Google Analytics, Firebase, Git, Bash, NPM, Browser Dev Console, NoSQL, Google Cloud Datastore, Google Cloud Platform (App Engine, PubSub, Cloud Functions, Bigtable, Cloud Endpoints). 
If you are passionate about technology and innovation, and thrive in a collaborative environment, we welcome you to join our team at JM Financial.
Posted 1 week ago
2.0 - 24.0 years
0 - 0 Lacs
noida, uttar pradesh
On-site
As an Assistant Product Manager at our organization, you will play a crucial role in supporting the development and enhancement of features on our OTT platform. Your responsibilities will primarily revolve around analytics, user experience, content delivery, and performance tracking within the OTT ecosystem. Collaborating with technical and business teams, you will drive product modules from ideation to release and analyze user data to improve key performance indicators related to user engagement and content consumption. Working closely with BI and data engineering teams, you will ensure accurate data pipelines and actionable insights. You will also coordinate with engineering and QA teams to test features, track issues, and facilitate smooth product releases. Gathering post-deployment feedback will be essential for assessing feature performance and identifying areas for improvement. Maintaining up-to-date product documentation, user stories, and requirement specs will also be part of your responsibilities, along with tracking tasks, bugs, and enhancements using Agile tools like JIRA or ClickUp. To excel in this role, you should hold a Bachelor's degree in Computer Science, Engineering, or Information Technology, with at least 2 years of experience in product operations, analytics, or product support, preferably within OTT, streaming, or SaaS platforms. Proficiency in analytics and reporting tools such as Google Analytics, Mixpanel, or Tableau, as well as hands-on experience with SQL for database querying, is required. Stakeholder management skills, familiarity with Agile tools, and a data-driven mindset are essential. Strong organizational, communication, and problem-solving skills will be beneficial in this role. Additionally, bonus points will be awarded for experience in data pipelines, ETL processes, or product instrumentation for analytics, as well as understanding of OTT technology components like CDNs, video transcoding, cloud storage, and media asset management systems. Basic knowledge of API interactions, client-server architecture, and performance monitoring tools will also be advantageous. This is a full-time, permanent position based in Noida, Uttar Pradesh. The work schedule is during day shifts, Monday to Friday. If you are ready to bring your expertise in product management and analytics to our dynamic team, we look forward to receiving your application.
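As a small, hedged example of the SQL-based KPI work mentioned above, the sketch below computes daily active users from a hypothetical playback-events table. The database file, table, column names, and SQLite backend are assumptions made purely for illustration.

```python
# Sketch: daily active users (DAU) over the last 7 days from an events table.
# Assumes event_ts is stored as ISO-8601 text so SQLite date functions apply.
import sqlite3

conn = sqlite3.connect("analytics.db")  # placeholder database file

rows = conn.execute("""
    SELECT date(event_ts) AS day, COUNT(DISTINCT user_id) AS dau
    FROM playback_events
    WHERE event_ts >= date('now', '-7 day')
    GROUP BY date(event_ts)
    ORDER BY day
""").fetchall()

for day, dau in rows:
    print(day, dau)

conn.close()
```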
Posted 1 week ago
2.0 - 6.0 years
0 Lacs
karnataka
On-site
You should possess a Bachelor's/Master's degree in Computer Science/Computer Engineering or a related field. Along with this, you must have at least 2-6 years of experience in server-side development using languages like GoLang, Node.JS, or Python. Furthermore, it is essential to have proficiency in AWS services such as Lambda, DynamoDB, Step Functions, S3, etc. and hands-on experience in deploying and managing Serverless service environments. Experience with Docker, containerization, and Kubernetes is also required for this role. In addition, knowledge of database technologies like MongoDB and DynamoDB, along with experience in CI/CD pipelines and automation, would be beneficial. Experience in Video Transcoding/Streaming on Cloud would be considered a plus. Lastly, strong problem-solving skills are a must-have for this position.
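As a minimal sketch of the serverless pattern described above (a Lambda function persisting state to DynamoDB), here is a hedged Python example. The table name, environment variable, and event shape are assumptions for illustration only.

```python
# Sketch: an AWS Lambda handler that records a (hypothetical) video job in DynamoDB.
import os
import boto3

TABLE_NAME = os.environ.get("TABLE_NAME", "video-jobs")  # placeholder table name
dynamodb = boto3.resource("dynamodb")


def handler(event, context):
    """Write one item per invocation; event fields are assumed, not a real schema."""
    table = dynamodb.Table(TABLE_NAME)
    table.put_item(Item={
        "job_id": event["job_id"],                 # assumed event field
        "status": event.get("status", "queued"),   # assumed event field
    })
    return {"statusCode": 200, "body": "ok"}
```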
Posted 2 weeks ago
5.0 - 7.0 years
13 - 18 Lacs
Hyderabad, Bengaluru
Work from Office
• Help build the backend services that power the playback experience for millions of subscribers around the world • Collaborate with other software engineers & product teams to ensure successful implementation of software solutions to meet our primary goal Required Candidate profile • You have 5+ years of experience crafting software solutions, with a track record of developing solutions used globally by millions of users. You have expertise in video streaming & DRM technologies.
Posted 2 weeks ago
3.0 - 10.0 years
18 - 22 Lacs
Hyderabad
Work from Office
WHAT YOU'LL DO Lead the development of scalable data infrastructure solutions Leverage your data engineering expertise to support data stakeholders and mentor less experienced Data Engineers. Design and optimize new and existing data pipelines Collaborate with cross-functional product engineering teams and data stakeholders to deliver on Codecademy’s data needs WHAT YOU'LL NEED 8 to 10 years of hands-on experience building and maintaining large-scale ETL systems Deep understanding of database design and data structures (SQL and NoSQL). Fluency in Python. Experience working with cloud-based data platforms (we use AWS) SQL and data warehousing skills -- able to write clean and efficient queries Ability to make pragmatic engineering decisions in a short amount of time Strong project management skills; a proven ability to gather and translate requirements from stakeholders across functions and teams into tangible results WHAT WILL MAKE YOU STAND OUT Experience with tools in our current data stack: Apache Airflow, Snowflake, dbt, FastAPI, S3, & Looker. Experience with Kafka, Kafka Connect, and Spark or other data streaming technologies Familiarity with the database technologies we use in production: Snowflake, Postgres, and MongoDB. Comfort with containerization technologies: Docker, Kubernetes, etc.
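A minimal Airflow sketch of the scheduled pipeline work a stack like the one above (Airflow, Snowflake, dbt) implies. The DAG ID, schedule, and task bodies are placeholder assumptions; a recent Airflow 2.x release is assumed for the `schedule` argument.

```python
# Sketch: a two-task daily ETL DAG with an extract step feeding a load step.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    print("pulling source data")  # placeholder extract logic


def load():
    print("loading into the warehouse")  # placeholder load logic


with DAG(
    dag_id="daily_etl_example",      # placeholder DAG id
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task
```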
Posted 2 weeks ago
2.0 - 7.0 years
12 - 16 Lacs
Hyderabad
Work from Office
WHAT YOU'LL DO Build scalable data infrastructure solutions Design and optimize new and existing data pipelines Integrate new data sources into our existing data architecture Collaborate with cross-functional product engineering teams and data stakeholders to deliver on Codecademy’s data needs WHAT YOU'LL NEED 3 to 5 years of hands-on experience building and maintaining large-scale ETL systems Deep understanding of database design and data structures (SQL and NoSQL). Fluency in Python. Experience working with cloud-based data platforms (we use AWS) SQL and data warehousing skills -- able to write clean and efficient queries Ability to make pragmatic engineering decisions in a short amount of time Strong project management skills; a proven ability to gather and translate requirements from stakeholders across functions and teams into tangible results WHAT WILL MAKE YOU STAND OUT Experience with tools in our current data stack: Apache Airflow, Snowflake, dbt, FastAPI, S3, & Looker. Experience with Kafka, Kafka Connect, and Spark or other data streaming technologies Familiarity with the database technologies we use in production: Snowflake, Postgres, and MongoDB. Comfort with containerization technologies: Docker, Kubernetes, etc.
Posted 2 weeks ago
2.0 - 5.0 years
3 - 8 Lacs
Navi Mumbai
Work from Office
Job Title: Video Headend Engineer Payroll: Quess Corp Ltd Experience: 2-5 years of experience in Headend, system integration, development, and operations. Reporting Location: RCP, Navi Mumbai Responsibilities/Duties: Responsible for delivering Headend projects. Own, and be accountable for, solution design and end-to-end project delivery. Qualifications: •Bachelor’s degree in Electronic Engineering, Electrical Engineering, Electronic & Telecommunication Engineering, or a similar technical field and/or equivalent experience. •BE/B.TECH in Computer Science (verify that the candidate is comfortable working as a Headend Engineer) Technical Skills (Must): •End-to-end knowledge of Digital Headend in OTT/IPTV/CATV/DTH environments. •In-depth knowledge of video standards, i.e. video encoding/transcoding (CBR/VBR) •Good knowledge of CAS/DRM •Knowledge of MPEG/DVB and video delivery systems (i.e. HLS, DASH), including bitrate ladder management •Knowledge of networking protocols, i.e. TCP/IP, HTTP, FTP, IGMP, UDP, RTP •Knowledge of streaming protocols, i.e. HLS, DASH, RTMP, SRT •Hands-on experience with configuration of IRD/Encoder/MUX/Packager/Origin. •Hands-on expertise with ATEME and Broadpeak devices. •Advanced knowledge of networking components – switches, routers, firewalls, etc. •Ability to adapt quickly to new or unfamiliar technology and products using documentation and online resources. •Knowledge of video and audio encoding compression standards •Knowledge of MPEG-4 encoders, MPEG Transport Stream analyzers, and satellite transmission equipment •Willingness to work at any time in a 24x7 staffed environment Other Skill Sets (Desired): •Technical understanding of Windows and Linux OS. •Understanding of database basics. •Knowledge of basic CCNA-level networking components. Non-Technical Skills: •Good command of written and spoken English. •Good expertise in MS Excel/MS Word/MS PowerPoint/MS Visio.
Posted 2 weeks ago
3.0 - 8.0 years
14 - 24 Lacs
Chennai
Work from Office
SUMMARY Job Title: Television Screen Repair Technician We are seeking an experienced TV Screen Repair Technician to join our technical service team. The ideal candidate will have 3-5 years of hands-on experience in diagnosing and repairing LED, LCD, OLED, and Smart TV screens across various brands. Key Responsibilities: Diagnose and repair screen-related issues including panel replacement, backlight problems, and display distortions Perform board-level repairs (T-Con, Main board, Power Supply) Conduct functional testing after repairs to ensure quality and performance Maintain accurate service logs and repair reports Communicate effectively with customers and internal teams Handle service calls on-site or in the workshop as needed Requirements: 3-5 years of experience in TV repair, especially screen-related issues Male candidates preferred Indian nationality is preferred Benefits Salary Offered: 150 to 200 (depending on the online interview) Housing and Meals: Housing is provided Working Hours: From 9:00 AM to 7:00 PM Weekly Day Off: One day Required Languages: English
Posted 2 weeks ago
4.0 - 6.0 years
6 - 16 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
What you'll be doing We are looking for data engineers who can work with world-class team members to help drive the telecom business to its full potential. We are building data products/assets for the telecom wireless and wireline business, which include consumer analytics, telecom network performance, and service assurance analytics. We are working on cutting-edge technologies like digital twin to build these analytical platforms and provide data support for varied AI/ML implementations. As a data engineer you will be collaborating with business product owners, coaches, industry-renowned data scientists, and system architects to develop strategic data solutions from sources that include batch, file, and data streams. As a Data Engineer with ETL/ELT expertise for our growing data platform & analytics teams, you will understand and enable the required data sets from different sources, both structured and unstructured, into our data warehouse and data lake with real-time streaming and/or batch processing to generate insights and perform analytics for business teams within the company. Understanding the business requirements and converting them to the technical design. Working on data ingestion, preparation, and transformation. Developing data streaming applications. Debugging production failures and identifying the solution. Working on ETL/ELT development. Understanding the DevOps process and contributing to DevOps pipelines. What we're looking for... You're curious about new technologies and the game-changing possibilities they create. You like to stay up-to-date with the latest trends and apply your technical expertise to solving business problems. You'll need to have: Bachelor's degree or four or more years of work experience. Experience with Data Warehouse concepts and the Data Management life cycle. Experience in the GCP cloud platform (BigQuery, Cloud Composer, Dataproc (or Hadoop + Spark), Cloud Functions). Experience in any programming language, preferably Python. Proficiency in graph data modeling, including experience with graph data models and graph query languages. Exposure to GenAI use cases. Experience in troubleshooting data issues. Experience in writing complex SQL and performance tuning. Experience in DevOps. Experience in GraphDB and Core Java. Experience in real-time streaming and lambda architecture.
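As a small, hedged example of the BigQuery work described above, the query below aggregates network telemetry by region. The project, dataset, table, and column names are placeholder assumptions, and default application credentials are assumed to be configured.

```python
# Sketch: run a weekly latency roll-up in BigQuery and print the results.
from google.cloud import bigquery

client = bigquery.Client()  # uses default credentials and project

query = """
    SELECT region, AVG(latency_ms) AS avg_latency_ms
    FROM `my-project.telemetry.network_events`   -- placeholder table
    WHERE event_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY)
    GROUP BY region
    ORDER BY avg_latency_ms DESC
"""

for row in client.query(query).result():
    print(row.region, round(row.avg_latency_ms, 1))
```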
Posted 2 weeks ago
5.0 - 10.0 years
22 - 37 Lacs
Kolkata, Hyderabad, Bengaluru
Hybrid
Inviting applications for the role of Senior Principal Consultant-Data Engineer, AWS! Locations Bangalore, Hyderabad, Kolkata Responsibilities Develop, deploy, and manage ETL pipelines using AWS services, Python, Spark, and Kafka. Integrate structured and unstructured data from various data sources into data lakes and data warehouses. Design and deploy scalable, highly available, and fault-tolerant AWS data processes using AWS data services (Glue, Lambda, Step Functions, Redshift). Monitor and optimize the performance of cloud resources to ensure efficient utilization and cost-effectiveness. Implement and maintain security measures to protect data and systems within the AWS environment, including IAM policies, security groups, and encryption mechanisms. Migrate application data from legacy databases to cloud-based solutions (Redshift, DynamoDB, etc.) for high availability at low cost. Develop application programs using Big Data technologies like Apache Hadoop and Apache Spark with appropriate cloud-based services such as Amazon AWS. Build data pipelines by building ETL processes (Extract-Transform-Load). Implement backup, disaster recovery, and business continuity strategies for cloud-based applications and data. Responsible for analysing business and functional requirements, which involves a review of existing system configurations and operating methodologies as well as understanding evolving business needs. Analyse requirements/user stories in business meetings and strategize the impact of requirements on different platforms/applications; convert the business requirements into technical requirements. Participate in design reviews to provide input on functional requirements, product designs, schedules, and/or potential problems. Understand current application infrastructure and suggest cloud-based solutions which reduce operational cost and require minimal maintenance while providing high availability with improved security. Perform unit testing on the modified software to ensure that the new functionality is working as expected while existing functionalities continue to work in the same way. Coordinate with release management and other supporting teams to deploy changes in the production environment. Qualifications we seek in you! Minimum Qualifications Experience in designing and implementing data pipelines, building data applications, and data migration on AWS. Strong experience implementing data lakes using AWS services like Glue, Lambda, Step Functions, and Redshift. Experience with Databricks will be an added advantage. Strong experience in Python and SQL. Proven expertise in AWS services such as S3, Lambda, Glue, EMR, and Redshift. Advanced programming skills in Python for data processing and automation. Hands-on experience with Apache Spark for large-scale data processing. Experience with Apache Kafka for real-time data streaming and event processing. Proficiency in SQL for data querying and transformation. Strong understanding of security principles and best practices for cloud-based environments. Experience with monitoring tools and implementing proactive measures to ensure system availability and performance. Excellent problem-solving skills and ability to troubleshoot complex issues in a distributed, cloud-based environment. Strong communication and collaboration skills to work effectively with cross-functional teams. Preferred Qualifications/Skills Master's degree in Computer Science, Electronics, or Electrical Engineering.
AWS Data Engineering & Cloud certifications and Databricks certifications. Experience with multiple data integration technologies and cloud platforms. Knowledge of Change & Incident Management processes.
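To illustrate the Glue-based pipeline operations described above, here is a hedged boto3 sketch that starts a Glue ETL job run and polls its status. The job name and argument are placeholder assumptions, not real resources.

```python
# Sketch: trigger a Glue ETL job and wait for it to finish.
import time

import boto3

glue = boto3.client("glue")

run = glue.start_job_run(
    JobName="orders-to-redshift",             # placeholder Glue job name
    Arguments={"--load_date": "2024-01-01"},  # placeholder job argument
)
run_id = run["JobRunId"]

while True:
    state = glue.get_job_run(JobName="orders-to-redshift", RunId=run_id)["JobRun"]["JobRunState"]
    print("state:", state)
    if state in ("SUCCEEDED", "FAILED", "STOPPED", "TIMEOUT"):
        break
    time.sleep(30)
```

In a production pipeline this trigger-and-poll step would usually live inside an orchestrator task (e.g., Airflow) rather than a standalone script.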
Posted 2 weeks ago
5.0 - 10.0 years
6 - 15 Lacs
Bengaluru
Work from Office
Greetings!!! If you're interested, please apply by clicking the link below: https://bloomenergy.wd1.myworkdayjobs.com/BloomEnergyCareers/job/Bangalore-Karnataka/Staff-Engineer---Streaming-Analytics_JR-19447 Role & responsibilities Our team at Bloom Energy embraces the unprecedented opportunity to change the way companies utilize energy. Our technology empowers businesses and communities to responsibly take charge of their energy. Our energy platform has three key value propositions: resiliency, sustainability, and predictability. We provide infrastructure that is flexible for the evolving net zero ecosystem. We have deployed more than 30,000 fuel cell modules since our first commercial shipments in 2009, sending energy platforms to data centers, hospitals, manufacturing facilities, biotechnology facilities, major retail stores, financial institutions, telecom facilities, utilities, and other critical infrastructure customers around the world. Our mission is to make clean, reliable energy affordable globally. We never stop striving to improve our technology, to expand and improve our company performance, and to develop and support the many talented employees that serve our mission! Role & responsibilities: Assist in developing distributed learning algorithms Responsible for building real-time analytics on cloud and edge devices Responsible for developing scalable data pipelines and analytics tools Solve challenging data and architectural problems using cutting-edge technology Cross-functional collaboration with data science, data engineering, and firmware controls teams Skills and Experience: Strong Java/Scala programming and debugging ability with a clear understanding of design patterns; Python is a bonus Understanding of Kafka/Spark/Flink/Hadoop/HBase internals (hands-on experience in one or more preferred) Implementing data wrangling, transformation, and processing solutions; demonstrated experience of working with large datasets Know-how of cloud computing platforms like AWS/GCP/Azure is beneficial Exposure to data lakes and data warehousing concepts, SQL, NoSQL databases Working knowledge of REST APIs and gRPC is a good-to-have skill Ability to adapt quickly to new technologies, concepts, approaches, and environments Problem-solving and analytical skills Must have a learning attitude and improvement mindset
Posted 3 weeks ago
6.0 - 8.0 years
16 - 20 Lacs
Hyderabad
Work from Office
Role & responsibilities Key Responsibilities: • Develop real-time data streaming applications using Apache Kafka and Kafka Streams. • Build and optimize large-scale batch and stream processing pipelines with Apache Spark. • Containerize applications and manage deployments using OpenShift and Kubernetes. • Collaborate with DevOps teams to ensure CI/CD pipelines are robust and scalable. • Write unit tests and conduct code reviews to maintain code quality and reliability. • Work closely with Product and Data Engineering teams to understand requirements and translate them into technical solutions. • Troubleshoot and debug production issues across multiple environments. Required qualifications to be successful in this role • Strong programming skills in Java/Python. • Hands-on experience with Apache Kafka, Kafka Streams, and event-driven architecture. • Solid knowledge of Apache Spark (batch and streaming). • Experience with OpenShift, Kubernetes, and container orchestration. • Familiarity with microservices architecture, RESTful APIs, and distributed systems. • Experience with build tools such as Maven or Gradle. • Familiarity with Git, Jenkins, CI/CD pipelines, and Agile development practices. • Excellent problem-solving skills and ability to work in a fast-paced environment. Education & Experience: • Bachelor's or Master's degree in Computer Science, Engineering, or a related field. • Minimum 6 years of experience in backend development with Java and related technologies. Preferred Skills (Nice to Have): • Knowledge of cloud platforms like AWS, Azure, or GCP. • Understanding of security best practices in cloud-native environments. • Familiarity with SQL/NoSQL databases (e.g., PostgreSQL, Cassandra, MongoDB). • Experience with Scala or Python for Spark jobs is a plus.
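A minimal PySpark Structured Streaming sketch of the Kafka-to-Spark pipeline described above. The broker address, topic, and checkpoint path are placeholder assumptions, and the job assumes the spark-sql-kafka connector package is on the classpath.

```python
# Sketch: consume a Kafka topic with Spark Structured Streaming and print records.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("kafka-stream-example").getOrCreate()

events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "localhost:9092")  # placeholder broker
    .option("subscribe", "events")                          # placeholder topic
    .load()
    .select(F.col("value").cast("string").alias("payload"))  # raw message payload
)

query = (
    events.writeStream
    .format("console")                                        # sink for demonstration only
    .option("checkpointLocation", "/tmp/checkpoints/events")  # placeholder path
    .outputMode("append")
    .start()
)
query.awaitTermination()
```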
Posted 3 weeks ago
6.0 - 11.0 years
3 - 8 Lacs
Pune
Remote
Role & responsibilities What You'll Do Build the underlying data platform and maintain data processing pipelines using best-in-class technologies. Special focus on R&D to challenge the status quo and build the next-generation data mesh that is efficient and cost-effective. Translate complex technical and functional requirements into detailed designs. Who You Are Strong programming skills (Python, Java, and Scala) Experience writing SQL, structuring data, and data storage practices Experience with data modeling Knowledge of data warehousing concepts Experience building data pipelines and microservices Experience with Spark, Airflow, and other streaming technologies to process incredible volumes of streaming data A willingness to accept failure, learn, and try again An open mind to try solutions that may seem impossible at first Strong understanding of data structures, algorithms, multi-threaded programming, and distributed computing concepts Experience working on Amazon Web Services (EMR, Kinesis, RDS, S3, SQS, and the like) Preferred candidate profile At least 6+ years of professional experience as a software engineer or data engineer Education: Bachelor's degree or higher in Computer Science, Data Science, Engineering, or a related technical field.
Posted 4 weeks ago
6.0 - 8.0 years
18 - 24 Lacs
Bengaluru
Work from Office
Design, develop, and optimize scalable data pipelines using Databricks and Apache Spark. Collaborate with data scientists, analysts, and business stakeholders to understand data needs. Build and manage ETL/ELT workflows in Databricks.
Posted 1 month ago
6.0 - 10.0 years
13 - 17 Lacs
Bengaluru
Work from Office
About Persistent We are a trusted Digital Engineering and Enterprise Modernization partner, combining deep technical expertise and industry experience to help our clients anticipate what’s next. Our offerings and proven solutions create a unique competitive advantage for our clients by giving them the power to see beyond and rise above. We work with many industry-leading organizations across the world including 12 of the 30 most innovative US companies, 80% of the largest banks in the US and India, and numerous innovators across the healthcare ecosystem. Our growth trajectory continues, as we reported $1,231M annual revenue (16% Y-o-Y). Along with our growth, we’ve onboarded over 4900 new employees in the past year, bringing our total employee count to over 23,500+ people located in 19 countries across the globe. Persistent Ltd. is dedicated to fostering diversity and inclusion in the workplace. We invite applications from all qualified individuals, including those with disabilities, and regardless of gender or gender preference. We welcome diverse candidates from all backgrounds. For more details please login to www.persistent.com About The Position We are looking for a Big Data Lead who will be responsible for the management of data sets that are too big for traditional database systems to handle. You will create, design, and implement data processing jobs in order to transform the data into a more usable format. You will also ensure that the data is secure and complies with industry standards to protect the company's information. What You'll Do Manage customer's priorities of projects and requests Assess customer needs utilizing a structured requirements process (gathering, analyzing, documenting, and managing changes) to prioritize immediate business needs and advise on options, risks, and cost Design and implement software products (Big Data related) including data models and visualizations Demonstrate participation with the teams you work in Deliver good solutions against tight timescales Be proactive, suggest new approaches, and develop your capabilities Share what you are good at while learning from others to improve the team overall Show that you have a certain level of understanding for a number of technical skills, attitudes, and behaviors Deliver great solutions Be focused on driving value back into the business Expertise You'll Bring 6 years' experience in designing and developing enterprise application solutions for distributed systems Understanding of Big Data Hadoop Ecosystem components (Sqoop, Hive, Pig, Flume) Additional experience working with Hadoop, HDFS, cluster management, Hive, Pig and MapReduce, and Hadoop ecosystem frameworks HBase, Talend, NoSQL databases Apache Spark or other streaming Big Data processing, preferred Java or Big Data technologies, will be a plus Benefits Competitive salary and benefits package Culture focused on talent development with quarterly promotion cycles and company-sponsored higher education and certifications Opportunity to work with cutting-edge technologies Employee engagement initiatives such as project parties, flexible work hours, and Long Service awards Annual health check-ups Insurance coverage: group term life, personal accident, and Mediclaim hospitalization for self, spouse, two children, and parents Inclusive Environment •We offer hybrid work options and flexible working hours to accommodate various needs and preferences.
•Our office is equipped with accessible facilities, including adjustable workstations, ergonomic chairs, and assistive technologies to support employees with physical disabilities. Let's unleash your full potential. See Beyond, Rise Above
Posted 1 month ago
6.0 - 10.0 years
13 - 17 Lacs
Pune
Work from Office
About Persistent We are a trusted Digital Engineering and Enterprise Modernization partner, combining deep technical expertise and industry experience to help our clients anticipate what’s next. Our offerings and proven solutions create a unique competitive advantage for our clients by giving them the power to see beyond and rise above. We work with many industry-leading organizations across the world including 12 of the 30 most innovative US companies, 80% of the largest banks in the US and India, and numerous innovators across the healthcare ecosystem. Our growth trajectory continues, as we reported $1,231M annual revenue (16% Y-o-Y). Along with our growth, we’ve onboarded over 4900 new employees in the past year, bringing our total employee count to over 23,500+ people located in 19 countries across the globe. Persistent Ltd. is dedicated to fostering diversity and inclusion in the workplace. We invite applications from all qualified individuals, including those with disabilities, and regardless of gender or gender preference. We welcome diverse candidates from all backgrounds. For more details please login to www.persistent.com About The Position We are looking for a Big Data Lead who will be responsible for the management of data sets that are too big for traditional database systems to handle. You will create, design, and implement data processing jobs in order to transform the data into a more usable format. You will also ensure that the data is secure and complies with industry standards to protect the company's information. What You'll Do Manage customer's priorities of projects and requests Assess customer needs utilizing a structured requirements process (gathering, analyzing, documenting, and managing changes) to prioritize immediate business needs and advise on options, risks, and cost Design and implement software products (Big Data related) including data models and visualizations Demonstrate participation with the teams you work in Deliver good solutions against tight timescales Be proactive, suggest new approaches, and develop your capabilities Share what you are good at while learning from others to improve the team overall Show that you have a certain level of understanding for a number of technical skills, attitudes, and behaviors Deliver great solutions Be focused on driving value back into the business Expertise You'll Bring 6 years' experience in designing and developing enterprise application solutions for distributed systems Understanding of Big Data Hadoop Ecosystem components (Sqoop, Hive, Pig, Flume) Additional experience working with Hadoop, HDFS, cluster management, Hive, Pig and MapReduce, and Hadoop ecosystem frameworks HBase, Talend, NoSQL databases Apache Spark or other streaming Big Data processing, preferred Java or Big Data technologies, will be a plus Benefits Competitive salary and benefits package Culture focused on talent development with quarterly promotion cycles and company-sponsored higher education and certifications Opportunity to work with cutting-edge technologies Employee engagement initiatives such as project parties, flexible work hours, and Long Service awards Annual health check-ups Insurance coverage: group term life, personal accident, and Mediclaim hospitalization for self, spouse, two children, and parents Inclusive Environment •We offer hybrid work options and flexible working hours to accommodate various needs and preferences.
•Our office is equipped with accessible facilities, including adjustable workstations, ergonomic chairs, and assistive technologies to support employees with physical disabilities. Let's unleash your full potential. See Beyond, Rise Above
Posted 1 month ago