0.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Associate

Job Description & Summary: At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.

Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge.

Responsibilities:
0–2 years of experience as an AI/ML engineer or in a similar role.
Strong knowledge of machine learning frameworks (e.g., TensorFlow, PyTorch, scikit-learn).
Hands-on experience with model development and deployment processes.
Proficiency in programming languages such as Python.
Experience with data preprocessing, feature engineering, and model evaluation techniques.
Familiarity with cloud platforms (e.g., AWS) and containerization (e.g., Docker, Kubernetes).
Familiarity with version control systems (e.g., GitHub).
Proficiency in data manipulation and analysis using libraries such as NumPy and Pandas.
Good to have: knowledge of deep learning and MLOps tools such as Kubeflow, MLflow, and Nextflow.
Knowledge of text analytics, NLP, and Gen AI.

Mandatory Skill Sets: ML Ops, AI/ML
Preferred Skill Sets: ML Ops, AI/ML
Years of Experience Required: 0–2
Education Qualification: B.Tech / M.Tech / MBA / MCA
Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Master of Engineering, Bachelor of Engineering, Master of Business Administration
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills: Data Science
Optional Skills: Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Apache Airflow, Apache Hadoop, Azure Data Factory, Communication, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling, Data Pipeline, Data Quality, Data Strategy {+ 22 more}
Desired Languages (if blank, desired languages not specified)
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
Job Posting End Date
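For readers unfamiliar with the model development and deployment loop this posting describes, the following is a minimal, hedged sketch in Python using scikit-learn on synthetic data; the dataset, parameters, and artifact file name are illustrative assumptions only, not anything specific to this role.

```python
# Illustrative only: a minimal train/evaluate/persist loop of the kind the role describes,
# using scikit-learn on a synthetic dataset (all names and parameters are assumptions).
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
import joblib

# Synthetic stand-in for a real feature table (e.g., a Pandas DataFrame after feature engineering).
X, y = make_classification(n_samples=2_000, n_features=20, n_informative=8, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Preprocessing and the model live in one pipeline so the deployed artifact stays consistent.
model = Pipeline([
    ("scaler", StandardScaler()),
    ("clf", RandomForestClassifier(n_estimators=200, random_state=42)),
])
model.fit(X_train, y_train)

# Basic model evaluation before promoting the artifact.
print(classification_report(y_test, model.predict(X_test)))

# Persist the fitted pipeline; an MLOps tool (MLflow, Kubeflow, etc.) would track this artifact.
joblib.dump(model, "model.joblib")
```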
Posted 19 hours ago
6.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Job Title: Python Developer – Web Scraping & ETL Specialist (E‑Commerce & Quick Commerce)
Location: Gurugram, Haryana, India
Experience: 3–6 years
Employment Type: Full-time

About Us
We are a fast-growing e‑commerce and quick commerce enterprise dedicated to delivering top-tier customer experiences and rapid fulfillment. Powered by robust data pipelines and a customer-first mindset, our goal is to stay ahead in the dynamic world of digital commerce.

Role Overview
We’re seeking a highly skilled Python Developer with a proven track record in web scraping, ETL, and database management, particularly within e‑commerce, marketplace, and quick commerce environments. In this role, you'll architect and maintain data pipelines that drive real-time pricing, inventory updates, and marketplace integrations.

Key Responsibilities
Design and implement scalable web scraping pipelines using Python frameworks such as Scrapy, Selenium, BeautifulSoup, and Playwright, including dynamic content handling, custom headers, request delays, proxy rotation, pagination handling, and structured output.
Build ETL workflows to extract, clean, transform, and load data into relational or NoSQL databases; create and optimize stored procedures.
Manage and optimize databases such as PostgreSQL, MySQL, and SQL Server, including stored procedures, indexing, and query optimization.
Apply your e‑commerce expertise to capture product data, pricing, stock levels, seller info, and reviews from marketplaces.
Support quick commerce data flows: real-time pipeline orchestration, rapid ingestion, monitoring, and alerting, with an emphasis on scalability, reliability, and tooling such as Airflow and Spark.
Collaborate with cross-functional squads (Ops, Product, Analytics) to align data delivery with business needs.
Implement monitoring and error handling, and maintain ETL/scraping reliability.

Required Qualifications
Bachelor’s or Master’s in Computer Science, IT, or a related field.
3+ years of working experience with:
Web scraping at scale (e‑commerce, marketplaces) using Python: BeautifulSoup, Scrapy, Selenium, Playwright, etc.
Designing ETL pipelines and leveraging ETL tools/frameworks (e.g., Pentaho, Airflow, or custom Python ETLs).
Database management: SQL, stored procedures, performance tuning.
Experience leveraging e‑commerce platforms/APIs or scraping data from marketplaces.
Understanding of quick commerce drivers: fast-moving datasets, low-latency ingestion, and scalable pipeline design.
Strong analytical and problem-solving skills, plus effective communication.

Preferred (Nice to Have)
Familiarity with cloud platforms (AWS, GCP, Azure) and services like Lambda, BigQuery, Cloud Functions, etc.
Proficiency with workflow orchestration tools like Apache Airflow, or experience implementing ETL scheduling.
Knowledge of NoSQL databases.
Containerization experience (e.g., Docker).
Exposure to quick commerce data modeling or marketplace data architecture.

Why Join Us
Be at the forefront of e‑commerce & quick commerce innovation.
Hands-on involvement in building real-time data systems that power business decisions.
Collaborative, fast-paced, and growth-oriented environment.
Attractive salary, benefits, and networking opportunities in a rapidly scaling ecosystem.

Interested candidates can share their resumes at hr@trailytics.com.
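To make the extract-transform-load loop referenced above concrete, here is a deliberately small sketch; the target URL, CSS selectors, and SQLite schema are invented placeholders, and a production pipeline would add the proxy rotation, retries, and orchestration the posting calls for.

```python
# Illustrative sketch only: a toy extract -> transform -> load pass of the kind the role describes.
# The URL, CSS selectors, and table schema below are hypothetical placeholders, not a real target site.
import sqlite3

import requests
from bs4 import BeautifulSoup

def extract(url: str) -> list[dict]:
    """Fetch a listing page and pull product name/price pairs (selectors are assumed)."""
    resp = requests.get(url, headers={"User-Agent": "etl-demo/0.1"}, timeout=30)
    resp.raise_for_status()
    soup = BeautifulSoup(resp.text, "html.parser")
    rows = []
    for card in soup.select(".product-card"):          # hypothetical selector
        name = card.select_one(".title")
        price = card.select_one(".price")
        if name and price:
            rows.append({"name": name.get_text(strip=True), "price": price.get_text(strip=True)})
    return rows

def transform(rows: list[dict]) -> list[tuple]:
    """Normalise price strings like '1,299' into floats; drop rows that fail to parse."""
    out = []
    for r in rows:
        digits = "".join(ch for ch in r["price"] if ch.isdigit() or ch == ".")
        if digits:
            out.append((r["name"], float(digits)))
    return out

def load(rows: list[tuple], db_path: str = "products.db") -> None:
    """Upsert into a local SQLite table standing in for PostgreSQL/MySQL."""
    with sqlite3.connect(db_path) as conn:
        conn.execute("CREATE TABLE IF NOT EXISTS products (name TEXT PRIMARY KEY, price REAL)")
        conn.executemany("INSERT OR REPLACE INTO products (name, price) VALUES (?, ?)", rows)

if __name__ == "__main__":
    load(transform(extract("https://example.com/listing")))  # placeholder URL
```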
Posted 19 hours ago
3.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Senior Associate

Job Description & Summary: At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.

Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge.

Responsibilities:
· Design, develop, and maintain data pipelines using Azure Data Factory (ADF) for ingestion, transformation, and integration of data from various energy systems.
· Implement workflows using Logic Apps and Function Apps to automate data processing and system interactions.
· Manage and optimize data storage using Azure Data Lake Storage Gen2 (ADLS Gen2) and Azure SQL databases.
· Integrate real-time and batch data from IoT devices and energy platforms using Event Hub and Service Bus.
· Collaborate with cross-functional teams to support data needs across power distribution, automation, and monitoring solutions.
· Ensure data quality, security, and compliance with industry standards and regulations.
· Document data architecture, workflows, and best practices for operational transparency and scalability.

Mandatory skill sets: Azure Data Factory, Logic Apps, Function Apps, Azure SQL, Event Hub, Service Bus, and ADLS
Preferred skill sets: Python, SQL, and data modeling
Years of experience required: 3 to 10 years
Education qualification: Bachelor's degree in Computer Science, Data Science, or any other engineering discipline. Master's degree is a plus.
Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Bachelor Degree, Master Degree
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills: Data Engineering
Optional Skills: Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling, Data Pipeline {+ 27 more}
Desired Languages (if blank, desired languages not specified)
Travel Requirements
Available for Work Visa Sponsorship?
Government Clearance Required?
Job Posting End Date
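As a rough illustration of the Event Hub to ADLS Gen2 flow listed in the responsibilities above, a minimal Python consumer is sketched below; the connection strings, entity names, and file layout are assumptions, and a real design would more likely use ADF or a Function App trigger rather than a standalone script.

```python
# Hedged sketch: read events from Azure Event Hubs and append them to a file in ADLS Gen2.
# Connection strings, entity names, and the file layout are placeholders/assumptions.
import os

from azure.eventhub import EventHubConsumerClient
from azure.storage.filedatalake import DataLakeServiceClient

EVENTHUB_CONN = os.environ["EVENTHUB_CONN"]          # assumed environment variables
ADLS_CONN = os.environ["ADLS_CONN"]

lake = DataLakeServiceClient.from_connection_string(ADLS_CONN)
fs = lake.get_file_system_client("raw")               # hypothetical container
target = fs.get_file_client("energy/events.jsonl")    # hypothetical path
target.create_file()                                  # start from an empty file

offset = 0

def on_event(partition_context, event):
    """Append each event body as one JSON line and checkpoint the partition."""
    global offset
    data = (event.body_as_str() + "\n").encode("utf-8")
    target.append_data(data, offset=offset, length=len(data))
    offset += len(data)
    target.flush_data(offset)
    partition_context.update_checkpoint(event)

consumer = EventHubConsumerClient.from_connection_string(
    EVENTHUB_CONN, consumer_group="$Default", eventhub_name="telemetry"  # assumed hub name
)
with consumer:
    consumer.receive(on_event=on_event, starting_position="-1")  # read from the beginning
```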
Posted 19 hours ago
6.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
Our world is transforming, and PTC is leading the way. Our software brings the physical and digital worlds together, enabling companies to improve operations, create better products, and empower people in all aspects of their business. Our people make all the difference in our success. Today, we are a global team of nearly 7,000 and our main objective is to create opportunities for our team members to explore, learn, and grow – all while seeing their ideas come to life and celebrating the differences that make us who we are and the work we do possible.

About PTC: PTC (NASDAQ: PTC) enables global manufacturers to achieve significant digital transformation through our market-leading software solutions. We empower customers to innovate faster, improve operations, and drive business growth—whether on-premises, in the cloud, or through our SaaS platform. At PTC, we don’t just imagine a better world—we enable it.

Role Overview: As a Senior Technical Support Specialist, you will serve as a key technical advisor and escalation point within the Servigistics Support organization. You will bring your rich industry experience to drive strategic customer success, mentor junior team members, and lead complex troubleshooting efforts. You will work cross-functionally with engineering, product management, and customer teams to ensure seamless and proactive technical support delivery.

Key Responsibilities:
Serve as the primary technical contact for high-priority and complex customer escalations.
Lead resolution of mission-critical issues involving product functionality, performance, and deployment.
Partner with global cross-functional teams to ensure holistic and timely resolution of customer challenges.
Proactively identify and drive improvements in support processes and product usability.
Contribute to and review KCS-aligned knowledge articles and promote customer self-service strategies.
Collaborate with product and engineering teams to influence the product roadmap based on customer feedback and insights.
Mentor and guide junior technical support engineers; provide coaching and best practices.
Represent support in customer meetings, escalations, and business reviews.
Maintain high SLA compliance for enterprise customers with complex environments.
Be available to work 24x7 on a rotational basis, with willingness to support weekend shifts when scheduled, ensuring readiness for global support needs.

Required Skills & Competencies:
Strong experience in diagnosing and resolving enterprise-grade application issues across multiple layers (web, application, and database).
Deep expertise in SQL (Oracle and SQL Server), with the ability to write and optimize complex queries.
Hands-on experience with ETL tools (Informatica, IICS, Kettle/Pentaho) and resolving batch job failures.
Solid understanding of open-source web technologies such as Apache Tomcat and Apache Web Server.
Experience in performance tuning, server configuration, log analysis, and application scalability.
Knowledge of Java-based enterprise applications and the implementation or support lifecycle.
Familiarity with enterprise IT environments (networks, load balancing, security protocols, integrations).
Proven ability to work independently under pressure while managing multiple complex issues.

Preferred Qualifications:
Experience with UNIX/Linux environments and command-line utilities.
Knowledge of cloud platforms such as AWS, including services like S3.
Exposure to machine learning concepts and their integration within enterprise systems.
Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
6+ years of relevant technical support, implementation, or consulting experience in enterprise software.
Excellent written and verbal communication skills; able to interact confidently with senior stakeholders.

Why Join PTC?
Work with innovative products and talented global teams.
Collaborative and inclusive culture where your voice matters.
Extensive benefits, including: best-in-class insurance; employee stock purchase plan and RSUs; generous PTO and paid parental leave; flexible work hours and no probation clause; career growth opportunities and higher education support.

Life at PTC is about more than working with today’s most cutting-edge technologies to transform the physical world. It’s about showing up as you are and working alongside some of today’s most talented industry leaders to transform the world around you. If you share our passion for problem-solving through innovation, you’ll likely become just as passionate about the PTC experience as we are. Are you ready to explore your next career move with us?

We respect the privacy rights of individuals and are committed to handling Personal Information responsibly and in accordance with all applicable privacy and data protection laws. Review our Privacy Policy here.
Posted 19 hours ago
7.0 years
3 - 5 Lacs
Hyderābād
On-site
Country: India
Working Schedule: Full-Time
Work Arrangement: Hybrid
Relocation Assistance Available: No
Posted Date: 06-Aug-2025
Job ID: 11153

Description and Requirements

Position Summary
This position is responsible for the design and implementation of application platform solutions, with an initial focus on Customer Communication Management (CCM) platforms: enterprise search and document generation/workflow products such as Quadient, xPression, Documaker, WebSphere Application Server (WAS), and technologies from OpenText. While gaining and providing expertise on these key business platforms, the Engineer will identify opportunities for automation and cloud-enablement across other technologies within the Platform Engineering portfolio and develop cross-functional expertise.

Job Responsibilities
Provide design and technical support to application developers and operations support staff when required, including promoting the use of best practices, ensuring standardization across applications, and troubleshooting.
Design and implement complex integration solutions through collaboration with engineers and application teams across the global enterprise.
Promote and utilize automation to design and support configuration management, orchestration, and maintenance of the integration platforms using tools such as Perl, Python, and Unix shell.
Collaborate with senior engineers to understand emerging technologies and their effect on unit cost and service delivery as part of the evolution of the integration technology roadmap.
Investigate, recommend, implement, and maintain CCM solutions across multiple technologies.
Investigate released fix packs and provide well-documented instructions and script automation to operations for implementation, in collaboration with senior engineers, in support of platform currency.
Perform capacity reviews of the current platform.
Participate in cross-departmental efforts.
Lead initiatives within the community of practice.
Willing to work in rotational shifts.
Good communication skills, with the ability to communicate clearly and effectively.

Knowledge, Skills and Abilities

Education
Bachelor's degree in Computer Science, Information Systems, or a related field.

Experience
7+ years of total experience in designing, developing, testing, and deploying n-tier applications built on Java, Python, WebSphere Application Server, Liberty, Apache Tomcat, etc.
At least 4+ years of experience on Customer Communication Management (CCM) and document generation platforms such as Quadient, xPression, and Documaker.
Linux/Windows OS
Apache / IHS
IBM WebSphere Application Server, Liberty
Quadient, xPression
Ansible
Shell scripting (Linux, PowerShell)
JSON/YAML
Ping, SiteMinder
Monitoring & observability (Elastic, AppDynamics, Kibana)
Troubleshooting, log and performance analysis
OpenShift

Other Requirements (licenses, certifications, specialized training – if required)

About MetLife
Recognized on Fortune magazine's list of the 2025 "World's Most Admired Companies" and Fortune World’s 25 Best Workplaces™ for 2024, MetLife, through its subsidiaries and affiliates, is one of the world’s leading financial services companies, providing insurance, annuities, employee benefits and asset management to individual and institutional customers. With operations in more than 40 markets, we hold leading positions in the United States, Latin America, Asia, Europe, and the Middle East. Our purpose is simple - to help our colleagues, customers, communities, and the world at large create a more confident future.
United by purpose and guided by empathy, we’re inspired to transform the next century in financial services. At MetLife, it’s #AllTogetherPossible. Join us!
Posted 19 hours ago
5.0 - 8.0 years
3 - 7 Lacs
Hyderābād
On-site
Wipro Limited (NYSE: WIT, BSE: 507685, NSE: WIPRO) is a leading technology services and consulting company focused on building innovative solutions that address clients’ most complex digital transformation needs. Leveraging our holistic portfolio of capabilities in consulting, design, engineering, and operations, we help clients realize their boldest ambitions and build future-ready, sustainable businesses. With over 230,000 employees and business partners across 65 countries, we deliver on the promise of helping our customers, colleagues, and communities thrive in an ever-changing world. For additional information, visit us at www.wipro.com.

Job Description
Installation of Apache/DSE Cassandra DB on bare-metal servers or cloud.
Upgrade of existing Apache/DSE Cassandra DB to a higher version.
Data migration activities on Cassandra DB.
Debugging and fixing issues on Cassandra DB with application teams.
Capacity planning.
Adding or removing Cassandra nodes, or a whole DC, to an existing DC.
Automating existing system monitoring and creating dashboards using shell script and Python.
Working on Ansible and Docker; good understanding of Ansible playbooks.
Setting up OpsCenter.
Good understanding of certificates and how keystore/truststore works in the DB.
Good exposure to authenticating Cassandra DB using JMX.
Good understanding of heap and how garbage collection works in the DB.
Working on Central Station tickets and Radars assigned by different application teams.
Basic understanding of Elasticsearch: creating Elasticsearch indexes and building dashboards using index metrics.
Basic understanding of Solr/Spark: creating Solr indexes and Spark jobs.
Experience supporting Linux or cloud environments.
Strong diagnostic, problem-solving, and decision-making abilities.
Ability to work independently and with others to support local, regional, and global teams.
Ability to manage multiple assignments in a fast-paced environment.
Knowledge of ticketing tools like Remedy.
Solid understanding of ITIL principles and SLAs.
On-call support on a rotational basis.

Mandatory Skills: Oracle Database Admin. Experience: 5-8 Years.

Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention. Of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.
Posted 19 hours ago
5.0 years
4 - 9 Lacs
Hyderābād
Remote
Data Engineer – Remote role based in India.

Note: This is a full-time, remote, salaried position through Red Elk Consulting, LLC, based in India. This role is 100% focused and dedicated to supporting Together Labs, as a consultant, and includes salary, benefits, vacation, and a local India-based support team.

We are seeking an experienced and motivated Data Engineer to join our dynamic data team. In this role, you will be responsible for designing, building, and maintaining scalable data pipelines, managing our data warehouse infrastructure, and supporting analytics initiatives across the organization. You will work closely with data scientists, analysts, and other stakeholders to ensure data quality, integrity, and accessibility, enabling the organization to make data-driven decisions.

RESPONSIBILITIES
Design and Develop Data Pipelines: Architect, develop, and maintain robust and scalable data pipelines for ingesting, processing, and transforming large volumes of data from multiple sources in real-time and batch modes.
Data Warehouse Management: Manage, optimize, and maintain the data warehouse infrastructure, ensuring data integrity, security, and availability. Oversee the implementation of best practices for data storage, partitioning, indexing, and schema design.
ETL Processes: Design and build efficient ETL (Extract, Transform, Load) processes to move data across various systems while ensuring high performance, reliability, and scalability.
Data Integration: Integrate diverse data sources (structured, semi-structured, and unstructured data) into a unified data model that supports analytics and reporting needs.
Support Analytics and BI: Collaborate with data analysts, data scientists, and business intelligence teams to understand data requirements and provide data sets, models, and solutions that support their analytics needs.
Data Quality and Governance: Establish and enforce data quality standards, governance policies, and best practices. Implement monitoring and alerting to ensure data accuracy, consistency, and completeness.
Operational Excellence: Drive the development of automated systems for provisioning, deployment, monitoring, failover, and recovery. Implement systems to monitor key performance metrics, logs, and alerts with a focus on automation and reducing manual intervention.
Cross-functional Collaboration: Work closely with product, engineering, and QA teams to ensure the infrastructure supports and enhances development workflows and that services are deployed and operated smoothly at scale.
Incident Management & Root Cause Analysis: Act as a first responder to data production issues, leading post-mortems and implementing long-term solutions to prevent recurrence. Ensure all incidents are handled promptly with a focus on minimizing impact.
Security & Compliance: Ensure our infrastructure is designed with security best practices in mind, including encryption, access control, and vulnerability scanning.
Continuous Improvement: Stay up to date with industry trends, technologies, and best practices, bringing innovative ideas into the team to improve reliability, performance, and scale.

QUALIFICATIONS
Education & Experience:
Bachelor’s degree in Computer Science, Engineering, or a related technical field, or equivalent practical experience.
5+ years of experience in data engineering, with a strong background in systems architecture, distributed systems, cloud infrastructure, or a related field.
Proven experience building and managing data pipelines, data warehouses, and ETL processes.
Technical Skills:
Strong proficiency in SQL and experience with relational databases (e.g., PostgreSQL, MySQL, Oracle) and data warehousing solutions (e.g., Snowflake, Redshift, BigQuery).
Expertise in data pipeline tools and frameworks (e.g., AWS Glue, Google Dataflow, Apache Airflow, Apache NiFi, dbt).
Hands-on experience with cloud platforms and their data services (e.g., AWS, Azure, Google Cloud Platform).
Proficiency in programming languages such as Python, Java, or Scala for data manipulation and automation.
Knowledge of data modeling, schema design, and data governance principles.
Familiarity with distributed data processing frameworks like Apache Spark, Hadoop, or similar.
Experience with BI tools (e.g., Tableau, Power BI, Looker).
Experience with AWS and standard practices for working in cloud-based environments.
Soft Skills:
Strong problem-solving and analytical skills with keen attention to detail.
Excellent communication and collaboration skills, with the ability to work effectively with technical and non-technical stakeholders.
Proactive mindset with the ability to work independently and handle multiple tasks in a fast-paced environment.

ABOUT US
Together Labs innovates technologies that empower people worldwide to connect, create and earn in virtual worlds. Our mission is to redefine social media as a catalyst for authentic human connection through the development of a family of products grounded in this core value. These include IMVU, the world's largest friendship discovery and social platform, and VCOIN, the first regulatory-approved transferable digital currency. For more information, please visit https://togetherlabs.com/

Founded in 2004 and based in the heart of Silicon Valley, Together Labs is led by a team that's dedicated to pioneering in virtual worlds. Together Labs is backed by venture investors Allegis Capital, Bridgescale Partners and Best Buy Capital. Together Labs (formerly IMVU) has been named a Best Place to Work in Silicon Valley for nine years running.

HOW TO APPLY
Please familiarize yourself with our products and feel free to try out our core product at https://www.imvu.com/

Together Labs is an equal opportunity employer, and is committed to fostering a culture of inclusion. Our unique differences enable us to learn, collaborate, and grow together. We welcome all applicants without regard to race, color, religious creed, sex, national origin, citizenship status, age, physical or mental disability, sexual orientation, gender identification, marital, parental, veteran or military status, unfavorable military discharge, decisions regarding reproductive health, or any other status protected by applicable federal, state, or local law. This is a remote position.
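As a small illustration of the pipeline-orchestration experience requested above, here is a hedged sketch of a daily ETL DAG assuming Apache Airflow 2.x (2.4 or later for the `schedule` argument); the DAG id, task bodies, and table names are invented placeholders.

```python
# Hedged sketch of a daily extract -> transform -> load DAG (Airflow 2.4+ assumed).
# The task bodies are stubs; the "orders" source and target table names are invented placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract(**context):
    # A real pipeline would pull from an API, S3, or an operational database.
    return [{"order_id": 1, "amount": 120.0}, {"order_id": 2, "amount": 75.5}]

def transform(ti, **context):
    rows = ti.xcom_pull(task_ids="extract")
    # Trivial transformation: keep only orders worth 100 or more.
    return [r for r in rows if r["amount"] >= 100]

def load(ti, **context):
    rows = ti.xcom_pull(task_ids="transform")
    # A real task would write to Snowflake/Redshift/BigQuery; here we just log the row count.
    print(f"would load {len(rows)} rows into analytics.orders_daily")

with DAG(
    dag_id="orders_daily_etl",          # hypothetical DAG name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_transform >> t_load
```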
Posted 19 hours ago
8.0 years
6 - 9 Lacs
Hyderābād
On-site
About the Role:
Grade Level (for internal use): 11

The Team: S&P Global Commodity Insights empowers organizations to create long-term, sustainable value by providing data and insights for a comprehensive view of the global energy and commodities markets.

The Impact: You will play a crucial role in developing products for the Environmental Solutions group. As Lead, you'll lead, architect, and build the data engineering needs for carbon registry products.

What’s in it for you: It’s a fast-paced agile environment that deals with huge volumes of data, so you’ll have an opportunity to sharpen your data skills and work on an emerging technology stack.

Responsibilities:
Architect, design, and develop solutions within a multi-functional Agile team to support key business needs for Commodity Insights.
Design and implement software components for content systems.
Perform analysis and articulate solutions.
Design underlying engineering for use in multiple product offerings supporting a large volume of end-users.
Manage and improve existing solutions.
Solve a variety of complex problems and figure out possible solutions, weighing the costs and benefits.
Engineer components and common services based on standard corporate development models, languages, and tools.
Apply software engineering best practices while also leveraging automation across all elements of solution delivery.
Collaborate effectively with technical and non-technical stakeholders.
Document and demonstrate technical solutions by developing documentation, diagrams, code comments, etc.

What We’re Looking For:
Basic Qualifications:
Bachelor’s/Master’s degree in Computer Science, Information Systems, or equivalent.
8+ years’ experience in application development.
Strong C# or Java and SQL skills.
Proficient with software development lifecycle (SDLC) methodologies like Agile and test-driven development.
Experience with SQL and NoSQL databases.
Experience with big data platforms such as Apache Spark.
Expertise in building event-driven, scalable and resilient systems.
A thorough understanding of information systems, business processes, key drivers, and measures of success, choosing the proper methodologies and policies to support broad business goals.
Experience working in Databricks.
Experience working in cloud computing environments such as AWS, Azure, or GCP.
Experience in the financial services domain is a plus.
Preferred Qualifications:
Experience within the carbon registry space.
Familiarity and/or enthusiasm with Data Science / Machine Learning is a plus.
Experience in Snowflake.
#LI-USA

About S&P Global Commodity Insights
At S&P Global Commodity Insights, our complete view of global energy and commodities markets enables our customers to make decisions with conviction and create long-term, sustainable value. We’re a trusted connector that brings together thought leaders, market participants, governments, and regulators to co-create solutions that lead to progress. Vital to navigating Energy Transition, S&P Global Commodity Insights’ coverage includes oil and gas, power, chemicals, metals, agriculture and shipping. S&P Global Commodity Insights is a division of S&P Global (NYSE: SPGI). S&P Global is the world’s foremost provider of credit ratings, benchmarks, analytics and workflow solutions in the global capital, commodity and automotive markets. With every one of our offerings, we help many of the world’s leading organizations navigate the economic landscape so they can plan for tomorrow, today.
For more information, visit http://www.spglobal.com/commodity-insights . What’s In It For You? Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology–the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress. Our People: We're more than 35,000 strong worldwide—so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We’re committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We’re constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference. Our Values: Integrity, Discovery, Partnership At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals. Benefits: We take care of you, so you can take care of business. We care about our people. That’s why we provide everything you—and your career—need to thrive at S&P Global. Our benefits include: Health & Wellness: Health care coverage designed for the mind and body. Flexible Downtime: Generous time off helps keep you energized for your time on. Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills. Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly Perks: It’s not just about you. S&P Global has perks for your partners and little ones, too, with some best-in class benefits for families. Beyond the Basics: From retail discounts to referral incentive awards—small perks can make a big difference. For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets. 
Recruitment Fraud Alert: If you receive an email from a spglobalind.com domain or any other regionally based domains, it is a scam and should be reported to reportfraud@spglobal.com . S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, “pre-employment training” or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here . ----------------------------------------------------------- Equal Opportunity Employer S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf ----------------------------------------------------------- 20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.2 - Middle Professional Tier II (EEO Job Group), SWP Priority – Ratings - (Strategic Workforce Planning) Job ID: 318828 Posted On: 2025-08-07 Location: Hyderabad, Telangana, India
Posted 19 hours ago
8.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Key Responsibilities
We are currently seeking a skilled and experienced Java J2EE Developer with a minimum of 8 years of hands-on experience.
Capability to create design solutions independently for a given module.
Develop and maintain web applications using Java and Spring Boot, and user interfaces using HTML, CSS, and JavaScript.
Write and maintain unit tests using JUnit and Mockito.
Deploy and manage applications on servers such as JBoss, WebLogic, Apache, and Nginx.
Ensure application security.
Familiarity with build tools such as Maven and Gradle.
Experience with caching technologies like Redis and Coherence.
Understanding of Spring Security; knowledge of Groovy is a plus.
Excellent problem-solving skills and attention to detail.
Strong communication and teamwork abilities.

Qualifications
Bachelor's degree in Computer Science, Information Technology, or a related field.
6–8 years of experience in full stack development.
Proven track record of delivering high-quality software solutions, working with cross-functional teams to define, design, and ship new features.
Troubleshoot and resolve issues in a timely manner.
Stay updated with the latest industry trends and technologies.
Should have knowledge of SQL.

Required Skills
Proficiency in HTML, CSS, and JavaScript; strong experience with Java and Spring frameworks (Spring Boot) and SQL, with familiarity with CI/CD.
Posted 19 hours ago
8.0 years
0 Lacs
Hyderābād
On-site
At least 8+ years of experience and strong knowledge of the Scala programming language, with the ability to write clean, maintainable, and efficient Scala code following best practices. Good knowledge of fundamental data structures and their usage. At least 8+ years of experience in designing and developing large-scale, distributed data processing pipelines using Apache Spark and related technologies, with expertise in Spark Core, Spark SQL, and Spark Streaming. Experience with Hadoop, HDFS, Hive, and other big data technologies. Familiarity with data warehousing and ETL concepts and techniques. Expertise in database concepts and SQL/NoSQL operations. UNIX shell scripting for scheduling/running application jobs is an added advantage. At least 8 years of experience in project development life cycle activities and maintenance/support projects. Work in an Agile environment and participate in daily scrum stand-ups, sprint planning, reviews, and retrospectives. Understand project requirements and translate them into technical solutions that meet the project quality standards. Ability to work in a team in a diverse, multi-stakeholder environment and collaborate with upstream/downstream functional teams to identify, troubleshoot, and resolve data issues. Strong problem-solving and analytical skills. Excellent verbal and written communication skills. Experience and desire to work in a global delivery environment. Stay up to date with new technologies and industry trends in development. Job Types: Full-time, Permanent, Contractual / Temporary Pay: ₹5,000.00 - ₹9,000.00 per day Work Location: In person
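For candidates preparing for a role like this, the sketch below illustrates the kind of batch aggregation such Spark pipelines perform. It is written in PySpark purely for consistency with the other examples on this page (the role itself emphasizes Scala), and the paths and column names are hypothetical, not taken from the posting.

```python
# Minimal PySpark sketch: batch aggregation with Spark SQL.
# Paths and column names are illustrative assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily-transaction-rollup").getOrCreate()

# Read raw events from a (hypothetical) HDFS location
events = spark.read.parquet("hdfs:///data/raw/transactions/")

# Aggregate per customer per day
daily = (
    events
    .withColumn("txn_date", F.to_date("txn_timestamp"))
    .groupBy("customer_id", "txn_date")
    .agg(
        F.count("*").alias("txn_count"),
        F.sum("amount").alias("total_amount"),
    )
)

# Write a curated, date-partitioned output table
daily.write.mode("overwrite").partitionBy("txn_date").parquet(
    "hdfs:///data/curated/daily_rollup/"
)
spark.stop()
```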
Posted 19 hours ago
1.0 years
1 - 5 Lacs
Hyderābād
On-site
Are you looking for an opportunity to join a team of engineers in positively affecting the experience of every consumer who uses Microsoft products? The OSSE team in the OPG group is focused on building client experiences and services that light up Microsoft Account experiences across all devices and platforms. We are passionate about working together to build delightful and inclusive account experiences that empower customers to get the most out of what Microsoft has to offer. We’re looking for a collaborative, inclusive and customer-obsessed engineer to help us build and sustain authentication experiences like Passkeys, as well as engage with our customers by building experiences that help users keep their account secure and connected across multiple devices and applications. We're looking for an enthusiastic Software Engineer to help us build account experiences and deliver business intelligence through data for experiences across 1.5 billion Windows devices and various Microsoft products. Your responsibilities will include working closely with a variety of teams such as Engineering, Program Management, Design and application partners to understand the key business questions for customer-facing scenarios, to set up the key performance indicators, and to set up data pipelines to identify insights and experiment ideas that move our business metrics. Microsoft’s mission is to empower every person and every organization on the planet to achieve more. As employees we come together with a growth mindset, innovate to empower others, and collaborate to realize our shared goals. Each day we build on our values of respect, integrity, and accountability to create a culture of inclusion where everyone can thrive at work and beyond. Responsibilities Enable the Windows, Developers, and Experiences team to do more with data across all aspects of the development lifecycle. Contribute to a data-driven culture as well as a culture of experimentation across the organization. Provide new and improve upon existing platform offerings with a fundamental understanding of the end-to-end scenarios. Collaborate with partner teams and customers to scope and deliver projects. You’ll write secure, reliable, scalable, and maintainable code, and then effectively debug it, test it, and support it. Author and design Big Data ETL pipelines in SCOPE, Scala, SQL, Python, or C#. Qualifications Required Qualifications: Bachelor's Degree in Computer Science or a related technical discipline with proven experience coding in languages including, but not limited to, C, C++, C#, Java, JavaScript, or Python, OR equivalent experience. Proven coding and debugging skills in C#, C++, Java, or SQL. Ability to work and communicate effectively across disciplines and teams. Preferred Qualifications: 1+ years of experience in data engineering. Understanding of and experience with cloud data computing technologies such as Azure Synapse, Azure Data Factory, SQL, Azure Data Explorer, Power BI, PowerApps, Hadoop, YARN, and Apache Spark. Excellent analytical skills with a systematic and structured approach to software design. Microsoft is an equal opportunity employer.
Consistent with applicable law, all qualified applicants will receive consideration for employment without regard to age, ancestry, citizenship, color, family or medical care leave, gender identity or expression, genetic information, immigration status, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran or military status, race, ethnicity, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable local laws, regulations and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application process, read more about requesting accommodations.
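As an illustration of the "business intelligence through data" and experimentation work this posting describes, here is a minimal, hedged sketch of an experiment KPI computation. It uses pandas on a tiny hypothetical dataset; the real pipelines described above run at far larger scale (SCOPE/Spark), and all column and file names are assumptions.

```python
# Hedged sketch: computing a simple experiment KPI (conversion rate per variant).
# The data layout is hypothetical and only meant to illustrate the idea.
import pandas as pd

# columns assumed: user_id, variant, converted (0/1)
events = pd.read_parquet("experiment_exposures.parquet")

kpi = (
    events.groupby("variant")
    .agg(users=("user_id", "nunique"), conversions=("converted", "sum"))
    .assign(conversion_rate=lambda d: d["conversions"] / d["users"])
)
print(kpi)
```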
Posted 19 hours ago
3.0 - 5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
TransUnion's Job Applicant Privacy Notice What We'll Bring TransUnion is a global information and insights company that makes trust possible in the modern economy. We do this by providing a comprehensive picture of each person so they can be reliably and safely represented in the marketplace. As a result, businesses and consumers can transact with confidence and achieve great things. We call this Information for Good.® A leading presence in more than 30 countries across five continents, TransUnion provides solutions that help create economic opportunity, great experiences and personal empowerment for hundreds of millions of people. What You'll Bring As consultant on our team, you will join a global group of statisticians, data scientists, and industry experts on a mission to extract insights from data and put them to good use. You will have an opportunity to be a part of a variety of analytical projects in a collaborative environment and be recognized for the work you deliver. TransUnion offers a culture of lifelong learning and as an associate here, your growth potential is limitless. The consultant role within the Research and Consulting team is responsible for delivering market-level business intelligence both to TransUnion’s senior management and to Financial Services customers. You will work on projects across international markets, including Canada, Hong Kong, UK, South Africa, Philippines, and Colombia. To be successful in this position, you must have good organizational skills, a strategic mindset, and a flexible predisposition. You will also be expected to operate independently and able to lead and present projects with minimal supervision. How You’ll Contribute You will develop a strong understanding of consumer credit data and how it applies to industry trends and research across different international markets You will dig in by extracting data and performing segmentation and statistical analyses on large population datasets (using languages such as R, SQL, and Python on Linux and PC computing platforms) You will conduct analyses and quantitative research studies designed to understand complex industry trends and dynamics, leveraging a variety of statistical techniques You will deliver analytic insights and recommendations in succinct and compelling presentations for internal and external customers at various levels including an executive audience; you may lead key presentations to clients You will perform multiple tasks simultaneously and deal with changing requirements and deadlines You will develop strong consulting skills to be able to help external customers by understanding their business needs and aligning them with TransUnion’s product offerings and capabilities You will help to cultivate an environment that promotes excellence, innovation, and a collegial spirit Through all these efforts, you will be a key contributor to driving the perception of TransUnion as an authority on lending dynamics and a worthwhile, trusted partner to our clients and prospects Impact You'll Make What you'll bring: A Bachelor’s or Master’s degree in Statistics, Applied Mathematics, Operations Research, Economics, or an equivalent discipline Minimum 3-5 years of experience in a relevant field, such as data analytics, lending, or risk strategy Advanced proficiency with one or more statistical programming languages such as R Advanced proficiency writing SQL queries for data extraction Experience with big data platforms (e.g. 
Apache Hadoop, Apache Spark) preferred Advanced experience with the MS Office suite, particularly Word, Excel, and PowerPoint Strong time management skills with the ability to prioritize and contribute to multiple assignments simultaneously Excellent verbal and written communication skills. You must be able to clearly articulate ideas to both technical and non-technical audiences Highly analytical mindset with the curiosity to dig deeper into data, trends, and consumer behavior A strong interest in the areas of banking, consumer lending, and finance is paramount, with a curiosity as to why consumers act the way they do with their credit Strong work ethic with the passion for team success This is a hybrid position and involves regular performance of job responsibilities virtually as well as in-person at an assigned TU office location for a minimum of two days a week. TransUnion Job Title Sr Consultant, Data Analysis and Consulting
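To illustrate the segmentation and statistical analyses this consultant role describes, here is a small, hedged pandas sketch. The score bands, field names, and delinquency definition are illustrative assumptions, not TransUnion definitions.

```python
# Hedged sketch: segmenting a consumer loan sample by credit-score band
# and summarizing balances and delinquency. All names/thresholds are assumptions.
import pandas as pd

# columns assumed: consumer_id, credit_score, balance, days_past_due
loans = pd.read_csv("consumer_loans_sample.csv")

bands = [300, 580, 670, 740, 800, 851]
labels = ["subprime", "near-prime", "prime", "prime-plus", "super-prime"]
loans["score_band"] = pd.cut(loans["credit_score"], bins=bands, labels=labels, right=False)

summary = (
    loans.assign(delinquent=loans["days_past_due"] >= 90)  # illustrative 90+ DPD definition
    .groupby("score_band", observed=True)
    .agg(
        accounts=("consumer_id", "count"),
        avg_balance=("balance", "mean"),
        delinquency_rate=("delinquent", "mean"),
    )
)
print(summary)
```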
Posted 19 hours ago
2.0 - 3.0 years
8 Lacs
Thiruvananthapuram
On-site
2 - 3 Years | 2 Openings | Trivandrum
Role description / Role Proficiency: This role requires proficiency in data pipeline development, including coding, testing, and implementing data pipelines for ingesting, wrangling, transforming, and joining data from various sources. Must be adept at using ETL tools such as Informatica, Glue, Databricks, and DataProc, along with coding skills in Python, PySpark, and SQL. Works independently according to work allocation.
Outcomes: Operate with minimal guidance to develop error-free code, test applications, and document the development process. Understand application features and component designs to develop them in accordance with user stories and requirements. Code, debug, test, document, and communicate the stages of product, component, or feature development. Develop optimized code using appropriate approaches and algorithms while adhering to standards and security guidelines independently. Complete foundational-level certifications in Azure, AWS, or GCP. Demonstrate proficiency in writing advanced SQL queries.
Measures of Outcomes: Adherence to engineering processes and standards; adherence to schedule/timelines; adherence to SLAs where applicable; number of defects post delivery; number of non-compliance issues; reduction of recurrence of known defects; quick turnaround of production bugs; completion of applicable technical/domain certifications; completion of all mandatory training requirements.
Outputs Expected: Code Development: Develop data processing code independently, ensuring it meets performance and scalability requirements. Documentation: Create comprehensive documentation for personal work and ensure it aligns with project standards. Configuration: Follow the configuration process diligently. Testing: Create and conduct unit tests for data pipelines and transformations to ensure data quality and correctness. Domain Relevance: Develop features and components with a solid understanding of the business problems being addressed for the client. Defect Management: Raise, fix, and retest defects in accordance with project standards. Estimation: Estimate time, effort, and resource dependencies for personal work. Knowledge Management: Consume and contribute to project-related documents, SharePoint libraries, and client universities. Release Management: Adhere to the release management process for seamless deployment. Design Understanding: Understand the design and low-level design (LLD) and link it to requirements and user stories. Certifications: Obtain relevant technology certifications to enhance skills and knowledge.
Skill Examples: Proficiency in SQL, Python, or other programming languages utilized for data manipulation. Experience with ETL tools such as Apache Airflow, Talend, Informatica, AWS Glue, Dataproc, and Azure ADF. Hands-on experience with cloud platforms like AWS, Azure, or Google Cloud, particularly with data-related services (e.g. AWS Glue, BigQuery). Conduct tests on data pipelines and evaluate results against data quality and performance specifications.
Knowledge Examples: Knowledge of various ETL services used by cloud providers, including Apache PySpark, AWS Glue, GCP DataProc/DataFlow, and Azure ADF/ADLF. Proficiency in SQL for analytics, including windowing functions. Understanding of data schemas and models.
Additional Comments / Job Description: Strong written and verbal communication skills in English. • Ability to work in 24x7 shift schedules, including night shifts for extended periods.
• Analytical and problem-solving skills to diagnose and address data-related issues. • Proficiency in writing SQL queries for data extraction and analysis. • Hands-on experience with MS Excel for data analysis. • Ability to work independently under minimal supervision while following SOPs. • Strong attention to detail and ability to manage multiple monitoring tasks effectively. As an L1 Data Ops Analyst, you will be responsible for monitoring data pipelines, dashboards, and databases to ensure smooth operations. You will follow Standard Operating Procedures (SOPs) and runbooks to identify, escalate, and resolve issues with minimal supervision. Strong analytical skills, attention to detail, and the ability to work in a fast-paced, 24x7 environment are critical for this role. Key Responsibilities: • Monitor various dashboards, pipelines, and databases continuously for a 9-hour shift. • Identify and escalate system or data anomalies based on predefined thresholds. • Follow SOPs and runbooks to troubleshoot and resolve basic data issues. • Work closely with L2 and L3 support teams for issue escalation and resolution. • Write and execute basic SQL queries for data validation and troubleshooting. • Analyze and interpret data using MS Excel to identify trends or anomalies. • Maintain detailed logs of incidents, resolutions, and escalations. • Communicate effectively with stakeholders, both verbally and in writing. Skills: SQL, Data Analysis, MS Excel, Dashboards. About UST: UST is a global digital transformation solutions provider. For more than 20 years, UST has worked side by side with the world’s best companies to make a real impact through transformation. Powered by technology, inspired by people and led by purpose, UST partners with their clients from design to operation. With deep domain expertise and a future-proof philosophy, UST embeds innovation and agility into their clients’ organizations. With over 30,000 employees in 30 countries, UST builds for boundless impact—touching billions of lives in the process.
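As a hedged illustration of the "basic SQL queries for data validation" and windowing functions mentioned above, the sketch below compares each pipeline run's row count with the previous run using a window function. The database, table, and column names are hypothetical.

```python
# Hedged sketch: a simple data-validation query using a window function (LAG),
# run here via SQLite for portability. Table and column names are placeholders.
import sqlite3

conn = sqlite3.connect("pipeline_monitoring.db")
query = """
SELECT
    pipeline_name,
    run_date,
    row_count,
    row_count - LAG(row_count) OVER (
        PARTITION BY pipeline_name ORDER BY run_date
    ) AS change_vs_previous_run
FROM pipeline_run_stats
ORDER BY pipeline_name, run_date;
"""
for row in conn.execute(query):
    print(row)
conn.close()
```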
Posted 19 hours ago
10.0 years
6 - 9 Lacs
Gurgaon
On-site
Company Description We're Nagarro. We are a Digital Product Engineering company that is scaling in a big way! We build products, services, and experiences that inspire, excite, and delight. We work at scale — across all devices and digital mediums, and our people exist everywhere in the world (17500+ experts across 39 countries, to be exact). Our work culture is dynamic and non-hierarchical. We're looking for great new colleagues. That's where you come in! Job Description REQUIREMENTS: Total experience 10+ years Extensive experience in back-end development utilizing Java 8 or higher, Spring Framework (Core/Boot/MVC), Hibernate/JPA, and Microservices Architecture. Strong experience in AWS (API Gateway, Fargate, S3, DynamoDB, SNS). Strong experience in SOAP and PostgreSQL. Hands-on experience with REST APIs, Caching system (e.g Redis) and messaging systems like Kafka etc. Proficiency in Service-Oriented Architecture (SOA) and Web Services (Apache CXF, JAX-WS, JAX-RS, SOAP, REST). Hands-on experience with multithreading, and cloud development. Strong working experience in Data Structures and Algorithms, Unit Testing, and Object-Oriented Programming (OOP) principles. Experience with DevOps tools and technologies such as Ansible, Docker, Kubernetes, Puppet, Jenkins, and Chef. Proficiency in build automation tools like Maven, Ant, and Gradle. Hands on experience on cloud technologies such as AWS/ Azure. Strong understanding of UML and design patterns. Ability to simplify solutions, optimize processes, and efficiently resolve escalated issues. Strong problem-solving skills and a passion for continuous improvement. Excellent communication skills and the ability to collaborate effectively with cross-functional teams. Enthusiasm for learning new technologies and staying updated on industry trends RESPONSIBILITIES: Writing and reviewing great quality code Understanding functional requirements thoroughly and analyzing the client’s needs in the context of the project Envisioning the overall solution for defined functional and non-functional requirements, and being able to define technologies, patterns and frameworks to realize it Determining and implementing design methodologies and tool sets Enabling application development by coordinating requirements, schedules, and activities. Being able to lead/support UAT and production roll outs Creating, understanding and validating WBS and estimated effort for given module/task, and being able to justify it Addressing issues promptly, responding positively to setbacks and challenges with a mindset of continuous improvement Giving constructive feedback to the team members and setting clear expectations. Helping the team in troubleshooting and resolving of complex bugs Coming up with solutions to any issue that is raised during code/design review and being able to justify the decision taken Carrying out POCs to make sure that suggested design/technologies meet the requirements Qualifications Bachelor’s or master’s degree in computer science, Information Technology, or a related field.
Posted 19 hours ago
7.0 years
10 - 17 Lacs
Gurgaon
On-site
Job Title: Senior Data Engineer Location: Gurugram, Haryana (Onsite) Experience Required: 7+ Years Job Overview We are hiring a highly experienced Senior Data Engineer to join our data engineering team in Gurugram . This is a full-time onsite role for a motivated professional with deep expertise in building scalable data infrastructure and modern data pipelines. Key Responsibilities Design and build scalable batch and real-time data pipelines Develop and manage data warehousing solutions using modern architectural patterns Work with AWS Data Services including S3, Glue, Athena, EMR, Kinesis Implement CDC (Change Data Capture) patterns and real-time data ingestion Use and manage file formats such as Parquet, Delta Lake, Apache Iceberg, and Hudi Handle stream processing using Apache Flink, Kafka Streams , or PySpark Orchestrate workflows using Apache Airflow Optimize data models for relational and NoSQL databases Ensure robust data integration and maintain cloud-based data systems Collaborate with cross-functional teams and manage stakeholders effectively Required Skills & Qualifications 7+ years of experience in Data Engineering Strong understanding of Data Warehousing and Architectural Patterns Proficiency in SQL and handling big data formats Hands-on experience with AWS data tools Solid knowledge of data streaming , workflow orchestration , and cloud data systems Excellent communication and stakeholder management skills Why Join Us? Competitive Salary Work with a forward-thinking team on cutting-edge data solutions Opportunity to lead and make an impact on large-scale data initiatives How to Apply If this role aligns with your background and career goals, we’d love to hear from you. Please submit your updated resume for further consideration. Job Types: Full-time, Permanent Pay: ₹1,000,000.00 - ₹1,700,000.00 per year Benefits: Health insurance Provident Fund Ability to commute/relocate: Gurgaon, Haryana: Reliably commute or planning to relocate before starting work (Required) Application Question(s): Have you worked extensively with AWS Data Services such as S3, Glue, Athena, EMR, or Kinesis? Are you proficient in working with file formats like Parquet, Delta Lake, Apache Iceberg, or Hudi? Do you have hands-on experience in building scalable batch and real-time data pipelines? Are you familiar with CDC (Change Data Capture) patterns and workflow orchestration tools like Apache Airflow? Have you worked with stream processing tools such as Apache Flink, Kafka Streams, or PySpark? Experience: Data Engineering: 7 years (Required) Work Location: In person
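For context on the workflow orchestration this role calls out, here is a minimal, hedged Apache Airflow sketch of a daily ingest DAG. The task logic, DAG ID, and schedule are placeholders rather than an actual pipeline from this employer.

```python
# Hedged sketch of a small Airflow DAG orchestrating one daily batch ingest step.
# In a real pipeline the callable might trigger Glue, EMR, or a Spark job.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def ingest_daily_partition(**context):
    # 'ds' is the logical date Airflow passes into the task context
    print(f"Ingesting partition for {context['ds']}")


with DAG(
    dag_id="daily_ingest_example",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    ingest = PythonOperator(
        task_id="ingest_daily_partition",
        python_callable=ingest_daily_partition,
    )
```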
Posted 19 hours ago
5.0 years
0 Lacs
Coimbatore, Tamil Nadu, India
On-site
StackBill is looking for a passionate and skilled Cloud Support Engineer to join our dynamic support team. This is an exciting opportunity to work on a leading Apache CloudStack-based cloud platform, supporting public cloud infrastructure across VMware and KVM environments. You’ll play a crucial role in maintaining high availability and performance for our clients, troubleshooting complex infrastructure issues, and contributing to a reliable and scalable cloud experience. 🔧 What You’ll Do Deliver L2/L3 technical support for customers on StackBill’s Apache CloudStack cloud platform Troubleshoot and resolve issues across compute, network, and storage layers (VMware ESXi, KVM, Ceph, NFS, etc.) Monitor system performance and ensure SLAs are consistently met Perform incident management, root cause analysis, and post-mortem reporting Collaborate with engineering teams to deploy, configure, and optimize CloudStack environments Guide customers on scaling strategies, best practices, and cloud performance tuning Create and maintain internal knowledge base and troubleshooting documentation 🧩 What You Bring 3–5 years of hands-on experience with VMware ESXi/vSphere and KVM Strong Linux administration skills (CentOS/Ubuntu) Good grasp of core networking: VLANs, VXLAN, SDN, Load Balancers, VPN Experience with storage technologies (Ceph, NFS, iSCSI) Sound troubleshooting abilities in complex infrastructure setups Familiarity with Apache CloudStack architecture is a strong advantage ⭐ Nice to Have Experience with DevOps tools: Ansible, Terraform, Jenkins, Git Familiarity with monitoring tools: Prometheus, Grafana, Zabbix Scripting (Bash/Python) for automation and orchestration Exposure to public cloud platforms (AWS, Azure, GCP) 🧠 Soft Skills Excellent written and verbal communication Strong analytical and problem-solving mindset A proactive, customer-first attitude Willingness to work in 24x7 rotational shifts
Posted 19 hours ago
7.0 - 9.0 years
0 Lacs
New Delhi, Delhi, India
On-site
The purpose of this role is to understand, model and facilitate change in a significant area of the business and technology portfolio either by line of business, geography or specific architecture domain whilst building the overall Architecture capability and knowledge base of the company. Job Description: Role Overview : We are seeking a highly skilled and motivated Cloud Data Engineering Manager to join our team. The role is critical to the development of a cutting-edge reporting platform designed to measure and optimize online marketing campaigns. The GCP Data Engineering Manager will design, implement, and maintain scalable, reliable, and efficient data solutions on Google Cloud Platform (GCP). The role focuses on enabling data-driven decision-making by developing ETL/ELT pipelines, managing large-scale datasets, and optimizing data workflows. The ideal candidate is a proactive problem-solver with strong technical expertise in GCP, a passion for data engineering, and a commitment to delivering high-quality solutions aligned with business needs. Key Responsibilities : Data Engineering & Development : Design, build, and maintain scalable ETL/ELT pipelines for ingesting, processing, and transforming structured and unstructured data. Implement enterprise-level data solutions using GCP services such as BigQuery, Dataform, Cloud Storage, Dataflow, Cloud Functions, Cloud Pub/Sub, and Cloud Composer. Develop and optimize data architectures that support real-time and batch data processing. Build, optimize, and maintain CI/CD pipelines using tools like Jenkins, GitLab, or Google Cloud Build. Automate testing, integration, and deployment processes to ensure fast and reliable software delivery. Cloud Infrastructure Management : Manage and deploy GCP infrastructure components to enable seamless data workflows. Ensure data solutions are robust, scalable, and cost-effective, leveraging GCP best practices. Infrastructure Automation and Management: Design, deploy, and maintain scalable and secure infrastructure on GCP. Implement Infrastructure as Code (IaC) using tools like Terraform. Manage Kubernetes clusters (GKE) for containerized workloads. Collaboration and Stakeholder Engagement : Work closely with cross-functional teams, including data analysts, data scientists, DevOps, and business stakeholders, to deliver data projects aligned with business goals. Translate business requirements into scalable, technical solutions while collaborating with team members to ensure successful implementation. Quality Assurance & Optimization : Implement best practices for data governance, security, and privacy, ensuring compliance with organizational policies and regulations. Conduct thorough quality assurance, including testing and validation, to ensure the accuracy and reliability of data pipelines. Monitor and optimize pipeline performance to meet SLAs and minimize operational costs. Qualifications and Certifications : Education: Bachelor’s or master’s degree in computer science, Information Technology, Engineering, or a related field. Experience: Minimum of 7 to 9 years of experience in data engineering, with at least 4 years working on GCP cloud platforms. Proven experience designing and implementing data workflows using GCP services like BigQuery, Dataform Cloud Dataflow, Cloud Pub/Sub, and Cloud Composer. Certifications: Google Cloud Professional Data Engineer certification preferred. Key Skills : Mandatory Skills: Advanced proficiency in Python for data pipelines and automation. 
Strong SQL skills for querying, transforming, and analyzing large datasets. Strong hands-on experience with GCP services, including Cloud Storage, Dataflow, Cloud Pub/Sub, Cloud SQL, BigQuery, Dataform, Compute Engine and Kubernetes Engine (GKE). Hands-on experience with CI/CD tools such as Jenkins, GitHub or Bitbucket. Proficiency in Docker, Kubernetes, Terraform or Ansible for containerization, orchestration, and infrastructure as code (IaC) Familiarity with workflow orchestration tools like Apache Airflow or Cloud Composer Strong understanding of Agile/Scrum methodologies Nice-to-Have Skills: Experience with other cloud platforms like AWS or Azure. Knowledge of data visualization tools (e.g., Power BI, Looker, Tableau). Understanding of machine learning workflows and their integration with data pipelines. Soft Skills : Strong problem-solving and critical-thinking abilities. Excellent communication skills to collaborate with technical and non-technical stakeholders. Proactive attitude towards innovation and learning. Ability to work independently and as part of a collaborative team. Location: Bengaluru Brand: Merkle Time Type: Full time Contract Type: Permanent
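As a hedged illustration of the BigQuery-based transformation work described above, the sketch below runs a small aggregation with the google-cloud-bigquery Python client. The project, dataset, table, and column names are assumptions for illustration only.

```python
# Hedged sketch: running a scheduled transformation against BigQuery from Python.
# All identifiers (project, datasets, tables, columns) are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-analytics-project")

sql = """
CREATE OR REPLACE TABLE reporting.campaign_daily AS
SELECT campaign_id,
       DATE(event_timestamp) AS event_date,
       COUNTIF(event_name = 'click') AS clicks,
       COUNTIF(event_name = 'impression') AS impressions
FROM raw.marketing_events
GROUP BY campaign_id, event_date
"""

job = client.query(sql)   # starts the query job
job.result()              # waits for completion
print(f"Job {job.job_id} finished.")
```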
Posted 19 hours ago
8.0 years
0 - 1 Lacs
Gurgaon
On-site
NEW REQUIREMENT!! Position: Data Engineer Experience: 8+ years Location: Gurugram (WFO) Budget: 1.2 LPM + GST Salary range: 90,000 to 1,00,000 JD: Data Engineer Design, develop and own robust data pipelines, ensuring optimal performance, scalability, and maintainability. Design and implement Data Lake, Data Warehouse and Lakehouse solutions with different architecture patterns. Ensure data quality, integrity and governance across all stages of the data lifecycle. Monitor and optimize performance of data engineering pipelines. Contribute to design principles, best practices, and documentation. Collaborate closely with cross-functional teams to deeply understand business requirements, translating them into effective technical designs and implementations that support the organization's data-driven initiatives. Provide mentorship and guidance to other members of the data engineering team, promoting knowledge transfer, a culture of continuous learning and skills development. We are looking for: Bachelor's degree in Computer Science, Information Systems, or a related field; Master's degree is a plus. A seasoned Data Engineer with a minimum of 8+ years of experience. Deep experience in designing and building robust, scalable data pipelines – both batch and real-time – using modern data engineering tools and frameworks. Proficiency in AWS Data Services (S3, Glue, Athena, EMR, Kinesis etc.). Strong grip on SQL queries, various file formats like Apache Parquet, Delta Lake, Apache Iceberg or Hudi, and CDC patterns. Experience in stream processing frameworks like Apache Flink or Kafka Streams, or other distributed data processing frameworks like PySpark. Expertise in workflow orchestration using Apache Airflow. Strong analytical and problem-solving skills, with the ability to work independently in a fast-paced environment. In-depth knowledge of database systems (both relational and NoSQL) and experience with data warehousing concepts. Hands-on experience with data integration tools and a strong familiarity with cloud-based data warehousing and processing is highly desirable. Excellent communication and interpersonal skills, facilitating effective collaboration with both technical and non-technical stakeholders. A strong desire to stay current with emerging technologies and industry best practices in the data landscape. Job Types: Full-time, Permanent Pay: ₹90,000.00 - ₹100,000.00 per year Work Location: In person
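To illustrate the CDC patterns and open table formats this posting lists, here is a minimal, hedged PySpark sketch of an upsert (MERGE) into a Delta Lake table. It assumes the delta-spark package is available, and the bucket paths and key column are placeholders.

```python
# Hedged sketch: CDC-style upsert (MERGE) into a Delta Lake table with PySpark.
# Assumes the delta-spark package is installed; paths and keys are illustrative.
from delta import configure_spark_with_delta_pip
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

builder = (
    SparkSession.builder.appName("cdc-upsert-example")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
)
spark = configure_spark_with_delta_pip(builder).getOrCreate()

# Captured change records (e.g. from a CDC feed landed as Parquet)
changes = spark.read.parquet("s3://example-bucket/cdc/customers/")

# Target Delta table in the lake
target = DeltaTable.forPath(spark, "s3://example-bucket/lake/customers/")

(
    target.alias("t")
    .merge(changes.alias("c"), "t.customer_id = c.customer_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)
```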
Posted 19 hours ago
3.0 years
20 - 25 Lacs
Gurgaon
Remote
About Us: Sun King (Greenlight Planet) is a multinational, for-profit business that designs, distributes, and finances solar-powered home energy products, with an underserved population in mind: the 1.8 billion global consumers for whom the old-fashioned electrical grid is either unavailable or too expensive. Over a decade in business, the company is now a leading global brand in emerging markets across Asia and Sub-Saharan Africa. Greenlight’s Sun King™ products provide modern light and energy to 32 million people in more than 60 countries and have sold over 8 million products worldwide. From the company’s wide range of trusted Sun King™ solar lamps and home energy systems, to its innovative distribution partnerships, to its EasyBuy™ pay-as-you-go consumer financing model, Greenlight Planet continuously strives to meet the evolving needs of the off-grid market. Greenlight stays in touch with underserved consumers’ needs in part by operating its own direct-to-consumer sales network, including thousands of trusted sales agents (called “Sun King Energy Officers”) across local communities. For Sun King Energy Officers, this is not only a good source of income and employment; they also become important members of their communities, bringing light and catering to local energy needs. Today, with over 2,700+ full-time employees in 15 countries, we remain continuously impressed at how each new team member contributes unique and innovative solutions to the global off-grid challenge, from new product designs, to innovative sales and distribution strategies, to setting up better collection mechanisms, to better training strategies, to more efficient logistical and after-sales service systems. We listen closely to each other to improve our products, our service, and ultimately, the lives of underserved consumers. Job location: Gurugram (Hybrid) About the role: Sun King is looking for a self-driven Infrastructure Engineer who is comfortable working in a fast-paced startup environment and balancing the needs of multiple development teams and systems. You will work on improving our current IaC, observability stack, and incident response processes. You will work with the data science, analytics, and engineering teams to build optimized CI/CD pipelines, scalable AWS infrastructure, and Kubernetes deployments. What you would be expected to do: Work with engineering, automation, and data teams on various infrastructure requirements. Design modular and efficient GitOps CI/CD pipelines, agnostic to the underlying platform. Manage AWS services for multiple teams. Manage custom data store deployments like sharded MongoDB clusters, Elasticsearch clusters, and upcoming services. Deploy and manage Kubernetes resources. Deploy and manage custom metrics exporters, trace data, and custom application metrics, and design dashboards querying metrics from multiple resources, as an end-to-end observability stack solution. Set up incident response services and design effective processes. Deploy and manage critical platform services like OPA and Keycloak for IAM. Advocate best practices for high availability and scalability when designing AWS infrastructure, observability dashboards, implementing IaC, deploying to Kubernetes, and designing GitOps CI/CD pipelines.
You might be a strong candidate if you have/are: Hands-on experience with Docker or any other container runtime environment and Linux with the ability to perform basic administrative tasks. Experience working with web servers (nginx, apache) and cloud providers (preferably AWS). Hands-on scripting and automation experience (Python, Bash), experience debugging and troubleshooting Linux environments and cloud-native deployments. Experience building CI/CD pipelines, with familiarity with monitoring & alerting systems (Grafana, Prometheus, and exporters). Knowledge of web architecture, distributed systems, and single points of failure. Familiarity with cloud-native deployments and concepts like high availability, scalability, and bottleneck. Good networking fundamentals — SSH, DNS, TCP/IP, HTTP, SSL, load balancing, reverse proxies, and firewalls. Good to have: Experience with backend development and setting up databases and performance tuning using parameter groups. Working experience in Kubernetes cluster administration and Kubernetes deployments. Experience working alongside sec ops engineers. Basic knowledge of Envoy, service mesh (Istio), and SRE concepts like distributed tracing. Setup and usage of open telemetry, central logging, and monitoring systems. Job Type: Full-time Pay: ₹2,000,000.00 - ₹2,500,000.00 per year Benefits: Cell phone reimbursement Flexible schedule Health insurance Internet reimbursement Provident Fund Work from home Application Question(s): What's your expected CTC? What's your notice period? What's your current CTC? Experience: AWS: 3 years (Required) Linux: 3 years (Required) Python: 2 years (Required) Work Location: In person
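As a hedged illustration of the "custom application metrics" and observability work described above, the sketch below exposes one metric to Prometheus using the prometheus_client Python library. The metric name, port, and measured value are placeholders.

```python
# Hedged sketch: exposing a custom application metric for Prometheus to scrape.
# The metric, port, and the "measurement" below are illustrative placeholders.
import random
import time

from prometheus_client import Gauge, start_http_server

queue_depth = Gauge("worker_queue_depth", "Pending jobs in the worker queue")

if __name__ == "__main__":
    start_http_server(9100)  # Prometheus scrapes http://host:9100/metrics
    while True:
        queue_depth.set(random.randint(0, 50))  # stand-in for a real measurement
        time.sleep(15)
```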
Posted 19 hours ago
3.0 years
0 Lacs
Bengaluru, Karnataka, India
Remote
Kaseya® is the leading provider of complete IT infrastructure and security management solutions for Managed Service Providers (MSPs) and internal IT organizations worldwide powered by AI. Kaseya’s best-in-breed technologies allow organizations to efficiently manage and secure IT to drive sustained business success. Kaseya has achieved sustained, strong double-digit growth over the past several years and is backed by Insight Venture Partners www.insightpartners.com), a leading global private equity firm investing in high-growth technology and software companies that drive transformative change in the industries they serve. Founded in 2000, Kaseya currently serves customers in over 20 countries across a wide variety of industries and manages over 15 million endpoints worldwide. To learn more about our company and our award-winning solutions, go to www.Kaseya.com and for more information on Kaseya’s culture. Kaseya is not your typical company. We are not afraid to tell you exactly who we are and our expectations. The thousands of people that succeed at Kaseya are prepared to go above and beyond for the betterment of our customers. Job Title: Technical Support Engineer - L1 Location & Mode: Bangalore - Onsite (No Hybrid/ WFH) Duration: Full-Time Shift:24/7 Skills Required: Windows Server; Networking; TCP/IP; Active Directory Availability: Immediate / serving NP/ NP of 30-45 Days Logistics: Cabs, Food, and other benefits provided Job Description: We are looking for a Customer Support Engineer with strong and proven customer service experience in the IT market. We are looking for candidates with willingness to learn, a solution-oriented mindset and excellent communication skills. The Customer Support Engineer will provide world-class support service to our customers and ensure customer satisfaction. The candidate is expected to maintain a professional, courteous and customer service-focused attitude always. Required Skills: Minimum 3 years’ experience in a Technical Support Role Solid knowledge and hands-on experience in TCP/IP protocol stack including addressing. TCP/IP, routing, switching, cabling, internet protocols (BGP,ISIS, and OSPF), firewalls, VPNs, and load balancers. SQL databases. Microsoft Windows Server. General inter-networking (e.g Active Directory). Networking concepts and protocols such as DNS, DHCP, FTP, TFTP, HTTP, iptables and PXE booting. Commands/utilities including but not limited to Apache, FTP, telnet, SSH, SMTP, POP, IMAP. Preferred Skills: Successful completion of the Kaseya Certified Administrator Certification (KCA). Solid In-depth knowledge of Linux/Unix and Windows environment. 
Industry-accepted certifications or equivalent work experience in one or more of the following areas: A+ Hardware/Software Network + CCNA Virtualization (VMware, Hyper-V) Linux+ MCP, MCTS or MCITP (Windows Server2k8, Windows Server 2012) Server+ Responsibilities: To take complete ownership of the diagnosis and resolution of Product issues, ranging from simple to very complex To provide the Customer with the most suitable, relevant solution in the best manner To engage with Customers both in writing and verbally in the most courteous manner, through all stages of resolution To ensure thorough collection of case details and they’re recorded correctly and professionally To coordinate with multiple teams if necessary and follow-up tenaciously with them to get Customer issues resolved To study the trends in issues being raised and suggest changes in line with the trends To assist in the development and implementation of new or improved service delivery strategies and initiatives Ensure that knowledge is transferred and shared within the team Assisting in the maintenance of all process documentation that is relevant to the Product and its customers Work within the development cycle to assist with product enhancements and improvements General Skills: Customer Centric. Excellent listening skills. Excellent communication skills, both verbal and written English. Strong Organizational, prioritization, and multitasking skills. Excellent phone etiquette. Excellent time management; (i.e. ability to prioritize tickets and complete research on time). Ability to properly articulate ideas, suggestions, and provide positive/constructive feedback. Ability to work independently without direct supervision. Willingness to work with team members or group to achieve common goals. Willingness to liaise with other departments to achieve common goals. Join the Kaseya growth rocket ship and see how we are #ChangingLives ! Additional Information Kaseya provides equal employment opportunity to all employees and applicants without regard to race, religion, age, ancestry, gender, sex, sexual orientation, national origin, citizenship status, physical or mental disability, veteran status, marital status, or any other characteristic protected by applicable law.
Posted 19 hours ago
Apache is a widely used software foundation that offers a range of open-source software solutions. In India, the demand for professionals with expertise in Apache tools and technologies is on the rise. Job seekers looking to pursue a career in Apache-related roles have a plethora of opportunities in various industries. Let's delve into the Apache job market in India to gain a better understanding of the landscape.
India's major IT hubs are known for their thriving technology sectors and see a high demand for Apache professionals across different organizations.
The salary range for Apache professionals in India varies based on experience and skill level:
- Entry-level: INR 3-5 lakhs per annum
- Mid-level: INR 6-10 lakhs per annum
- Experienced: INR 12-20 lakhs per annum
In the Apache job market in India, a typical career path may progress as follows:
1. Junior Developer
2. Developer
3. Senior Developer
4. Tech Lead
5. Architect
Besides expertise in Apache tools and technologies, professionals in this field are often expected to have skills in:
- Linux
- Networking
- Database Management
- Cloud Computing
As you embark on your journey to explore Apache jobs in India, it is essential to stay updated on the latest trends and technologies in the field. By honing your skills and preparing thoroughly for interviews, you can position yourself as a competitive candidate in the Apache job market. Stay motivated, keep learning, and pursue your dream career with confidence!