
7900 Hadoop Jobs - Page 32

Set up a job alert
JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

10.0 - 14.0 years

0 Lacs

vijayawada, andhra pradesh

On-site

As a Lead Data Engineer based in Vijayawada, Andhra Pradesh, you will leverage your extensive experience in data engineering and data architecture to design and develop end-to-end data solutions, data pipelines, and ETL processes. With a Bachelor's or Master's degree in Computer Science, Information Systems, or a related field, and over 10 years of relevant experience, you will play a crucial role in the success of data projects. You will bring strong knowledge of data technologies such as Snowflake, Databricks, Apache Spark, Hadoop, dbt, Fivetran, and Azure Data Factory. Expertise in Python and SQL will be essential for tackling complex data challenges, and your understanding of data governance, data quality, and data security principles will guide you in maintaining high standards of data management.

In this role, your problem-solving and analytical skills will be put to the test as you work both independently and collaboratively in an Agile environment. Strong communication and leadership skills will be instrumental in managing projects and teams and in engaging in pre-sales activities. You will demonstrate technical leadership by delivering solutions within defined timeframes and building strong client relationships.

Experience with complete project life cycle activities, agile methodologies, and globally distributed teams will be valuable in this position, as will a proven track record of managing complex consulting projects and communicating effectively with technical and non-technical staff. If you are looking for a role that combines technical expertise, leadership, and client engagement, this Lead Data Engineer position offers a dynamic opportunity to excel in a fast-paced, collaborative environment.

Posted 1 week ago

Apply

6.0 - 10.0 years

0 Lacs

thiruvananthapuram, kerala

On-site

You are a highly skilled Senior Big Data Engineer with 6 to 9 years of experience, joining the team at UST. Your expertise lies in Big Data technologies, cloud platforms, and data processing frameworks, with a focus on PySpark and Google Cloud Platform (GCP).

Your responsibilities will include designing, developing, and maintaining scalable data pipelines and ETL workflows using PySpark, Hadoop, and Hive, and deploying and managing big data workloads on cloud platforms such as GCP and AWS. You will collaborate with cross-functional teams to understand data requirements and deliver high-quality solutions, optimize data processing jobs for performance and cost-efficiency on cloud infrastructure, and implement automation and CI/CD pipelines to streamline deployment and monitoring of data workflows. Ensuring data security, governance, and compliance in cloud environments is crucial, along with troubleshooting data issues and monitoring job executions and system health.

Mandatory skills:
- Strong experience with PySpark for developing data processing jobs and ETL pipelines
- Hands-on experience with GCP services such as BigQuery, Dataflow, and Dataproc
- Expertise with the Hadoop ecosystem, including Hadoop, Hive, and related big data tools
- Familiarity with AWS data services such as S3, EMR, Glue, or Redshift
- Strong SQL and data modeling skills

Good to have:
- Experience with CI/CD tools and DevOps practices (Jenkins, GitLab, Terraform, etc.)
- Knowledge of containerization and orchestration tools such as Docker and Kubernetes
- Experience with Infrastructure as Code (IaC) and data governance and security best practices

Your expertise in Spark, Hadoop, Hive, and GCP makes you a valuable asset to the team at UST.
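The extract-transform-load pattern at the heart of this role can be sketched in a few lines. This is an illustrative example using only Python's standard library (not PySpark, so it runs anywhere); the table, field names, and sample data are invented.

```python
import csv
import io
import sqlite3

# Extract: read raw records (an in-memory CSV stands in for a data-lake file).
RAW = "order_id,amount,region\n1,120.50,south\n2,80.00,north\n3,oops,south\n"

def extract(text):
    return list(csv.DictReader(io.StringIO(text)))

# Transform: drop malformed rows and normalise types, as an ETL job would.
def transform(rows):
    clean = []
    for row in rows:
        try:
            clean.append((int(row["order_id"]), float(row["amount"]), row["region"]))
        except ValueError:
            continue  # a real pipeline would quarantine bad records for review
    return clean

# Load: write the cleaned rows into a warehouse table (SQLite stands in here).
def load(rows, conn):
    conn.execute("CREATE TABLE IF NOT EXISTS orders (order_id INT, amount REAL, region TEXT)")
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW)), conn)
total = conn.execute("SELECT COUNT(*), SUM(amount) FROM orders").fetchone()
print(total)  # (2, 200.5) -- the malformed third row was dropped
```

In a PySpark pipeline the same three stages map to `spark.read`, DataFrame transformations, and `DataFrame.write`, with the framework handling distribution.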

Posted 1 week ago

Apply

7.0 - 10.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Location: Noida / Indore / Bangalore / Hyderabad / Pune
Salary: 30-32 LPA
Experience: 7 to 10 years

Position Summary: We are looking for candidates with hands-on experience in Big Data or Cloud technologies.

Must-have technical skills:
- 7 to 10 years of experience
- Expertise and hands-on experience with Spark DataFrames and Hadoop ecosystem components
- Hands-on experience with at least one cloud platform (AWS/Azure/GCP), including data ingestion, processing, and orchestration
- Good knowledge of PySpark (Spark SQL)
- Good communication skills

Good to have:
- Good knowledge of shell scripting and Python
- Good knowledge of SQL
- Good knowledge of migration projects on Hadoop
- Good knowledge of a workflow engine such as Oozie or Autosys
- Good knowledge of Agile development
- Passion for exploring new technologies
- An automation-oriented approach

Roles & Responsibilities:
- Lead technical implementation of Data Warehouse modernization projects for Impetus
- Design and develop applications on Cloud technologies
- Lead technical discussions with internal and external stakeholders
- Resolve technical issues for the team
- Ensure the team completes all tasks and activities as planned
- Code development
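The workflow engines named above (Oozie, Autosys) execute job graphs in dependency order. A minimal sketch of that idea, assuming made-up task names and using the standard library's topological sorter in place of a real scheduler:

```python
from graphlib import TopologicalSorter

# A workflow definition: each task lists the tasks it depends on,
# the way an Oozie or Autosys job graph does (task names are invented).
WORKFLOW = {
    "ingest": set(),
    "validate": {"ingest"},
    "transform": {"validate"},
    "load_warehouse": {"transform"},
    "refresh_dashboard": {"load_warehouse"},
}

def run(workflow):
    """Execute tasks in dependency order; a real engine adds retries, SLAs, and alerts."""
    executed = []
    for task in TopologicalSorter(workflow).static_order():
        executed.append(task)  # stand-in for launching the actual job
    return executed

order = run(WORKFLOW)
print(order)  # ingest runs first, refresh_dashboard last
```

`TopologicalSorter` also raises `CycleError` on circular dependencies, the same validation a production scheduler performs when a workflow is submitted.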

Posted 1 week ago

Apply

9.0 - 13.0 years

0 Lacs

karnataka

On-site

As a Senior ML Scientist on the Advertising Optimization & Automation Science team at Wayfair, you will play a crucial role in leveraging machine learning and generative AI to streamline campaign workflows. Based in Bangalore, India, you will contribute to building intelligent systems that drive personalized recommendations and campaign automation within Wayfair's advertising platform. Your responsibilities will include designing and implementing intelligent budget, tROAS, and SKU recommendations, as well as simulation-driven decisioning to enhance advertiser outcomes and unlock substantial commercial value. In collaboration with cross-functional teams, you will lead the development of GenAI-powered creative optimization and automation to drive incremental ad revenue and improve supplier outcomes. Additionally, you will elevate technical standards by promoting best practices in ML system design and development across the team.

The ideal candidate possesses a Bachelor's or Master's degree in Computer Science, Mathematics, Statistics, or a related field, along with 9+ years of experience building large-scale machine learning algorithms. You should have a strong theoretical understanding of statistical models and ML algorithms, proficiency in programming languages such as Python, and experience with relevant ML libraries like TensorFlow and PyTorch. Strategic thinking, a customer-centric mindset, and a desire for creative problem-solving are essential for success in this position. You should also be adept at influencing senior-level stakeholders, possess excellent communication skills, and be able to shape technical roadmaps through cross-functional partnerships. Nice-to-have qualifications include experience with GCP, Airflow, and containerization, as well as familiarity with generative AI and agentic workflows. Knowledge of Bayesian learning, multi-armed bandits, or reinforcement learning is also advantageous.

By joining Wayfair, you will be part of a dynamic and innovative team dedicated to reinventing the way people shop for their homes. If you are looking for a fast-paced environment that offers continuous learning and growth opportunities, Wayfair may be the perfect place to advance your career.
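The posting lists multi-armed bandits as a nice-to-have. The core idea, balancing exploration of uncertain options against exploitation of the best-known one, fits in a few lines. This is a generic epsilon-greedy sketch, not Wayfair's system; the three "ad creatives" and their click-through rates are invented.

```python
import random

random.seed(0)  # deterministic demo

# True (hidden) click-through rates for three hypothetical ad creatives.
TRUE_CTR = [0.02, 0.05, 0.08]

def epsilon_greedy(true_ctr, steps=5000, epsilon=0.1):
    counts = [0] * len(true_ctr)   # pulls per arm
    values = [0.0] * len(true_ctr)  # running mean reward per arm
    for _ in range(steps):
        if random.random() < epsilon:
            arm = random.randrange(len(true_ctr))                      # explore
        else:
            arm = max(range(len(true_ctr)), key=values.__getitem__)    # exploit
        reward = 1.0 if random.random() < true_ctr[arm] else 0.0       # simulated click
        counts[arm] += 1
        values[arm] += (reward - values[arm]) / counts[arm]            # incremental mean
    return counts, values

counts, values = epsilon_greedy(TRUE_CTR)
best = counts.index(max(counts))
print(best, counts)  # with enough steps, traffic concentrates on the highest-CTR creative
```

Production systems typically replace epsilon-greedy with Thompson sampling or UCB, which explore more efficiently, but the allocation loop has the same shape.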

Posted 1 week ago

Apply

2.0 years

0 Lacs

Gurugram, Haryana

On-site

Location: Gurugram, Haryana, India
Job Id: GGN00002145 | Information Technology | Full-Time | Posted 07/22/2025

Achieving our goals starts with supporting yours. Grow your career, access top-tier health and wellness benefits, build lasting connections with your team and our customers, and travel the world using our extensive route network. Come join us to create what's next. Let's define tomorrow, together.

Description: United's Digital Technology team designs, develops, and maintains massively scaling technology solutions brought to life with innovative architectures, data analytics, and digital solutions.

Our values: At United Airlines, we believe that inclusion propels innovation and is the foundation of all that we do. Our shared purpose, "Connecting people. Uniting the world.", drives us to be the best airline for our employees, customers, and everyone we serve, and we can only do that with a truly diverse and inclusive workforce. Our team spans the globe and is made up of diverse individuals all working together with cutting-edge technology to build the best airline in the history of aviation. With multiple employee-run "Business Resource Group" communities and world-class benefits like health insurance, parental leave, and space-available travel, United is truly a one-of-a-kind place to work. Come join our team and help us make a positive impact on the world.

Job overview and responsibilities: This role collaborates with the Business and IT teams to identify the value, scope, features, and delivery roadmap for data engineering products and solutions. You will be responsible for communicating with stakeholders across the board, including customers, business managers, and the development team, to make sure the goals are clear and the vision is aligned with business objectives. Responsibilities include:
- Perform data analysis using SQL, including data quality analysis, data profiling, and summary reports
- Perform trend analysis and dashboard creation based on visualization techniques
- Execute assigned projects and analyses within agreed timelines, with accuracy and quality
- Document analysis results and formally present findings to management
- Perform ETL workflow analysis, create current/future-state data flow diagrams, and help the team assess the business impact of changes or enhancements
- Understand existing Python code workbooks and write pseudocode
- Collaborate with key stakeholders to identify the business case/value and create documentation

This position is offered on local terms and conditions. Expatriate assignments and sponsorship for employment visas, even on a time-limited visa status, will not be awarded. United Airlines is an equal opportunity employer. United Airlines recruits, employs, trains, compensates, and promotes regardless of race, religion, color, national origin, gender identity, sexual orientation, physical ability, age, veteran status, and other protected status as required by applicable law.

Required qualifications:
- BE, BTech, or equivalent in computer science or a related STEM field
- 5+ years of total IT experience as a Data Analyst/Business Data Analyst or Data Engineer
- 2+ years of experience with Big Data technologies such as PySpark, Hadoop, and Redshift
- 3+ years of experience writing SQL queries on RDBMS or cloud-based databases
- Experience with visualization tools such as Spotfire, Power BI, or QuickSight
- Experience in data analysis and requirements gathering
- Strong problem-solving skills; a creative, driven, detail-oriented focus for tackling tough problems with data and insights
- Natural curiosity and a desire to solve problems
- Excellent communication and analytical skills
- Must be legally authorized to work in India for any employer without sponsorship
- Must be fluent in English and Hindi (written and spoken)
- Successful completion of an interview is required to meet job qualifications
- Reliable, punctual attendance is an essential function of the position

Preferred qualifications:
- AWS certification
- Strong experience with continuous integration and delivery using Agile methodologies
- Data engineering experience in the transportation/airline industry
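The data-profiling work this role describes (row counts, null counts, distinct counts per column) reduces to a handful of SQL aggregates. A self-contained sketch using Python's built-in `sqlite3`; the table and its sample rows are invented for illustration.

```python
import sqlite3

# Build a tiny sample table to profile (all data here is made up).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE flights (flight_no TEXT, origin TEXT, delay_min INT)")
conn.executemany(
    "INSERT INTO flights VALUES (?, ?, ?)",
    [("UA100", "DEL", 5), ("UA101", "DEL", None), ("UA102", "BLR", 12), ("UA100", None, 0)],
)

def profile(conn, table, columns):
    """Per-column data-quality summary: total rows, nulls, and distinct values."""
    report = {}
    total = conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    for col in columns:
        nulls = conn.execute(f"SELECT COUNT(*) FROM {table} WHERE {col} IS NULL").fetchone()[0]
        distinct = conn.execute(f"SELECT COUNT(DISTINCT {col}) FROM {table}").fetchone()[0]
        report[col] = {"rows": total, "nulls": nulls, "distinct": distinct}
    return report

report = profile(conn, "flights", ["flight_no", "origin", "delay_min"])
print(report["origin"])  # {'rows': 4, 'nulls': 1, 'distinct': 2}
```

Note that `COUNT(DISTINCT col)` ignores NULLs, so nulls and distinct counts are reported separately; the same queries translate directly to Redshift or any RDBMS the posting mentions.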

Posted 1 week ago

Apply

0.0 years

0 Lacs

Bengaluru, Karnataka

On-site

Note: By applying to this position you will have an opportunity to share your preferred working location from the following: Hyderabad, Telangana, India; Bengaluru, Karnataka, India.

Minimum qualifications:
- Bachelor's degree in Computer Science or a related technical field, or equivalent practical experience.
- Experience building data and AI solutions and working with technical customers.
- Experience designing cloud enterprise solutions and supporting customer projects to completion.
- Ability to communicate fluently in English to support client relationship management in this region.

Preferred qualifications:
- Experience working with Large Language Models, data pipelines, data analytics, and data visualization techniques.
- Experience with core data ETL techniques.
- Experience leveraging LLMs to deploy multimodal solutions encompassing text, image, video, and voice.
- Knowledge of data warehousing concepts, including data warehouse technical architectures, infrastructure components, ETL/ELT, and reporting/analytic tools and environments (Apache Beam, Hadoop, Spark, Pig, Hive, MapReduce, Flume).
- Knowledge of cloud computing, including virtualization, hosted services, multi-tenant cloud infrastructures, storage systems, and content delivery networks.

About the job: The Google Cloud Consulting Professional Services team guides customers through the moments that matter most in their cloud journey to help businesses thrive. We help customers transform and evolve their business through the use of Google's global network, web-scale data centers, and software infrastructure. As part of an innovative team in this rapidly growing business, you will help shape the future of businesses of all sizes and use technology to connect with customers, employees, and partners. As a Cloud Engineer, you'll play a key role in ensuring that strategic customers have the best experience moving to the Google Cloud GenAI and Agentic AI suite of products.

You will design and implement solutions for customer use cases, leveraging core Google products. You'll work with customers to identify opportunities to transform their business with GenAI, and deliver workshops designed to educate and empower customers to realize the full potential of Google Cloud. You will have access to Google's technology to monitor application performance, debug and troubleshoot product issues, and address customer and partner needs. In this role, you will lead the timely adoption of Google Cloud Platform solutions by the customer.

Google Cloud accelerates every organization's ability to digitally transform its business and industry. We deliver enterprise-grade solutions that leverage Google's cutting-edge technology, and tools that help developers build more sustainably. Customers in more than 200 countries and territories turn to Google Cloud as their trusted partner to enable growth and solve their most critical business problems.

Responsibilities:
- Deliver effective big data and GenAI solutions and solve complex technical customer challenges.
- Act as a trusted technical advisor to Google's strategic customers.
- Identify new product features and feature gaps, provide guidance on existing product challenges, and collaborate with Product Managers and Engineers to influence the roadmap of Google Cloud Platform.
- Deliver best-practice recommendations, tutorials, blog articles, and technical presentations, adapting to different levels of key business and technical stakeholders.

Google is proud to be an equal opportunity workplace and is an affirmative action employer. We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity or Veteran status. We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. See also Google's EEO Policy and EEO is the Law. If you have a disability or special need that requires accommodation, please let us know by completing our Accommodations for Applicants form.

Posted 1 week ago

Apply

0.0 years

0 Lacs

Hyderabad, Telangana

On-site

Note: By applying to this position you will have an opportunity to share your preferred working location from the following: Hyderabad, Telangana, India; Bengaluru, Karnataka, India.

Minimum qualifications:
- Bachelor's degree in Computer Science or a related technical field, or equivalent practical experience.
- Experience building data and AI solutions and working with technical customers.
- Experience designing cloud enterprise solutions and supporting customer projects to completion.
- Ability to communicate fluently in English to support client relationship management in this region.

Preferred qualifications:
- Experience working with Large Language Models, data pipelines, data analytics, and data visualization techniques.
- Experience with core data ETL techniques.
- Experience leveraging LLMs to deploy multimodal solutions encompassing text, image, video, and voice.
- Knowledge of data warehousing concepts, including data warehouse technical architectures, infrastructure components, ETL/ELT, and reporting/analytic tools and environments (Apache Beam, Hadoop, Spark, Pig, Hive, MapReduce, Flume).
- Knowledge of cloud computing, including virtualization, hosted services, multi-tenant cloud infrastructures, storage systems, and content delivery networks.

About the job: The Google Cloud Consulting Professional Services team guides customers through the moments that matter most in their cloud journey to help businesses thrive. We help customers transform and evolve their business through the use of Google's global network, web-scale data centers, and software infrastructure. As part of an innovative team in this rapidly growing business, you will help shape the future of businesses of all sizes and use technology to connect with customers, employees, and partners. As a Cloud Engineer, you'll play a key role in ensuring that strategic customers have the best experience moving to the Google Cloud GenAI and Agentic AI suite of products.

You will design and implement solutions for customer use cases, leveraging core Google products. You'll work with customers to identify opportunities to transform their business with GenAI, and deliver workshops designed to educate and empower customers to realize the full potential of Google Cloud. You will have access to Google's technology to monitor application performance, debug and troubleshoot product issues, and address customer and partner needs. In this role, you will lead the timely adoption of Google Cloud Platform solutions by the customer.

Google Cloud accelerates every organization's ability to digitally transform its business and industry. We deliver enterprise-grade solutions that leverage Google's cutting-edge technology, and tools that help developers build more sustainably. Customers in more than 200 countries and territories turn to Google Cloud as their trusted partner to enable growth and solve their most critical business problems.

Responsibilities:
- Deliver effective big data and GenAI solutions and solve complex technical customer challenges.
- Act as a trusted technical advisor to Google's strategic customers.
- Identify new product features and feature gaps, provide guidance on existing product challenges, and collaborate with Product Managers and Engineers to influence the roadmap of Google Cloud Platform.
- Deliver best-practice recommendations, tutorials, blog articles, and technical presentations, adapting to different levels of key business and technical stakeholders.

Google is proud to be an equal opportunity workplace and is an affirmative action employer. We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity or Veteran status. We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. See also Google's EEO Policy and EEO is the Law. If you have a disability or special need that requires accommodation, please let us know by completing our Accommodations for Applicants form.

Posted 1 week ago

Apply

6.0 - 10.0 years

0 Lacs

karnataka

On-site

As a Big Data Data Modeller, you will lead moderately complex initiatives and deliverables within technical domain environments. Your responsibilities will include contributing to large-scale strategic planning; designing, coding, testing, debugging, and documenting projects and programs associated with the technology domain, including upgrades and deployments; reviewing technical challenges that require in-depth evaluation of technologies and procedures; and resolving complex issues. You will lead a team to meet existing and potential client needs while leveraging a solid understanding of the function, policies, procedures, and compliance requirements. Collaborating and consulting with peers, colleagues, and mid-level managers to resolve technical challenges and achieve goals will be a key aspect of your role. You will also lead projects, act as an escalation point, and provide guidance and direction to less experienced staff.

Desired qualifications:
- Minimum of 6 years of hands-on experience in Big Data enterprise application development
- Proficiency in continuous integration and delivery practices when developing code
- Experience in the banking/financial technology domain preferred

Job expectations:
- Collaborate with scrum stakeholders to implement modernized and sustainable technology roadmaps
- Analyze technical requirements and implement software solutions
- Stay updated on current and future trends and practices in technology
- Resolve application issues with software solutions and respond to suggestions for improvements and enhancements
- Proactively manage risk through the implementation of appropriate controls, and escalate where required
- Work with the engineering manager, product owner, and team to ensure the product is delivered with quality, on time, and within budget
- Coordinate project interlocks and deployments with internal IT teams
- Possess strong verbal and written communication skills to work effectively in a global development environment

This is a full-time position with a day shift schedule, requiring in-person work at the Bangalore location. The application deadline for this opportunity is 08/08/2025.

Posted 1 week ago

Apply

2.0 - 9.0 years

0 Lacs

chennai, tamil nadu

On-site

Tiger Analytics is a global AI and analytics consulting firm with a team of over 2800 professionals focused on using data and technology to solve complex problems that impact millions of lives worldwide. Our culture is centered around expertise, respect, and a team-first mindset. Headquartered in Silicon Valley, we have delivery centers globally and offices in various cities across India, the US, UK, Canada, and Singapore, along with a significant remote workforce. Tiger Analytics is certified as a Great Place to Work. Joining our team means being at the forefront of the AI revolution, working with innovative teams that push boundaries and create inspiring solutions.

We are currently looking for an Azure Big Data Engineer to join our team in Chennai, Hyderabad, or Bangalore. As a Big Data Engineer (Azure), you will build and implement analytics solutions and platforms on Microsoft Azure using a range of open source, big data, and cloud technologies. Your typical day might involve designing and building scalable data ingestion pipelines, processing structured and unstructured data, orchestrating pipelines, collaborating with teams and stakeholders, and making critical tech-related decisions.

To be successful in this role, we expect you to have 4 to 9 years of total IT experience with at least 2 years in big data engineering and Microsoft Azure. You should be proficient in technologies such as Azure Data Factory (ADF), PySpark, Databricks, ADLS, Azure SQL Database, Azure Synapse Analytics, Event Hub and Stream Analytics, Cosmos DB, and Purview. Strong coding skills in SQL and Python or Scala/Java are essential, as is experience with big data technologies such as Hadoop, Spark, Airflow, NiFi, Kafka, Hive, Neo4j, and Elasticsearch. Knowledge of file formats such as Delta Lake, Avro, Parquet, JSON, and CSV is also required.

Ideally, you should have experience building REST APIs, working on Data Lake or Lakehouse projects, supporting BI and Data Science teams, and following Agile and DevOps processes. Certifications such as Data Engineering on Microsoft Azure (DP-203) or Databricks Certified Data Engineer would be a valuable addition to your profile.

At Tiger Analytics, we value diversity and inclusivity, and we encourage individuals with different skills and qualities to apply, even if they do not meet all the criteria for the role. We are committed to providing equal opportunities and fostering a culture of listening, trust, respect, and growth. Please note that the job designation and compensation will be based on your expertise and experience; our compensation packages are competitive within the industry. If you are passionate about leveraging data and technology to drive impactful solutions, we would love to stay connected with you.
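One small step from the kind of ingestion pipeline described above, converting a CSV extract into newline-delimited JSON while enforcing a typed schema on read, can be sketched with the standard library. The field names and sample data are invented; a real Azure pipeline would do the equivalent in ADF or PySpark at scale.

```python
import csv
import io
import json

# A CSV extract standing in for a source-system file (hypothetical fields).
CSV_EXTRACT = "event_id,user,amount\n1,anu,250\n2,ravi,125\n"

def csv_to_jsonl(text):
    """Convert CSV rows to newline-delimited JSON, typing numeric fields on read."""
    lines = []
    for row in csv.DictReader(io.StringIO(text)):
        row["amount"] = int(row["amount"])  # enforce a typed schema on read
        lines.append(json.dumps(row, sort_keys=True))
    return "\n".join(lines)

jsonl = csv_to_jsonl(CSV_EXTRACT)
print(jsonl.splitlines()[0])  # {"amount": 250, "event_id": "1", "user": "anu"}
```

Newline-delimited JSON is a common landing format because each record is independently parseable, which suits streaming ingestion and partitioned lake storage; columnar formats like Parquet or Delta then take over downstream.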

Posted 1 week ago

Apply

8.0 - 12.0 years

0 Lacs

chennai, tamil nadu

On-site

The Applications Development Senior Programmer Analyst position entails participating in the establishment and implementation of new or revised application systems and programs in collaboration with the Technology team. Your main objective in this role is to contribute to applications systems analysis and programming activities.

Responsibilities include conducting feasibility studies, time and cost estimates, IT planning, risk technology, applications development, and model development, and establishing and implementing new or revised applications systems and programs to meet specific business needs or user areas. You will monitor and control all phases of the development process, including analysis, design, construction, testing, and implementation, and provide user and operational support on applications to business users. You will utilize in-depth specialty knowledge of applications development to analyze complex problems, evaluate business processes, system processes, and industry standards, and make evaluative judgments. Additionally, you will recommend and develop security measures in post-implementation analysis of business usage to ensure successful system design and functionality. Consulting with users, clients, and other technology groups on issues, recommending advanced programming solutions, and installing and assisting customer exposure systems are also part of your responsibilities. You will help ensure that essential procedures are followed, help define operating standards and processes, and serve as an advisor or coach to new or lower-level analysts. You will be expected to operate with a limited level of direct supervision, exercise independence of judgment and autonomy, and act as a subject matter expert to senior stakeholders and other team members.

Qualifications for this position include:
- 8 to 12 years of application development experience with Java/J2EE technologies
- Experience with core Java/J2EE applications, with complete command of OOP and design patterns
- Proficiency in data structures and algorithms
- Thorough knowledge of and hands-on experience with Big Data technologies: Hadoop, with experience in Hive or Java-based Spark programming
- Implementation of, or participation in, complex project execution in the Big Data Spark ecosystem
- Experience working in an agile environment following Scrum best practices
- Expertise in designing and optimizing software solutions for performance and stability
- Strong troubleshooting and problem-solving skills
- Experience in test-driven development

Education: Bachelor's degree/University degree or equivalent experience.

This is a full-time position in the Technology Job Family Group, specifically within the Applications Development Job Family.

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

punjab

On-site

We are looking for a highly analytical and experienced Data Scientist to join our team. As a Data Scientist, you will analyze large datasets, identify trends, and build data products to extract valuable business insights. If you have a passion for turning raw data into actionable intelligence and possess strong analytical and problem-solving skills, we invite you to apply.

Your responsibilities will include identifying valuable data sources and automating collection processes, implementing NLP algorithms for extracting insights from unstructured data, preprocessing structured and unstructured data, exploring innovative deep learning approaches, analyzing information to discover trends, building predictive models and machine-learning algorithms, utilizing ensemble modeling techniques, presenting information using data visualization, and proposing solutions to address business challenges. Collaboration with engineering and product development teams will be key.

To be successful in this role, you should have proven experience as a Data Scientist; expertise in applying machine learning, NLP, computer vision, and deep learning techniques; proficiency in programming languages such as R, SQL, and Python; experience with business intelligence tools and data frameworks; strong math skills; exceptional problem-solving aptitude; and excellent communication and presentation skills. A B.Tech in Computer Science or a related quantitative field is preferred.
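The ensemble modeling mentioned above rests on a simple idea: several imperfect models voting together often beat any one of them. A toy majority-vote sketch, where hand-written rules stand in for trained estimators and the fraud-detection framing, field names, and thresholds are all invented for illustration:

```python
from collections import Counter

# Three weak "classifiers" for a hypothetical fraud check (1 = flag, 0 = pass).
# In practice these would be trained estimators (trees, logistic regression, etc.).
def model_a(x): return 1 if x["amount"] > 100 else 0
def model_b(x): return 1 if x["country"] != "IN" else 0
def model_c(x): return 1 if x["hour"] < 6 else 0

def ensemble_predict(models, x):
    """Majority vote across models; ties resolve to the first-counted label."""
    votes = Counter(m(x) for m in models)
    return votes.most_common(1)[0][0]

sample = {"amount": 250, "country": "IN", "hour": 3}
print(ensemble_predict([model_a, model_b, model_c], sample))  # 1 (two of three vote "flag")
```

Voting works because the models' errors are (ideally) uncorrelated; bagging and boosting libraries generalize the same principle with weighted or sequential combinations.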

Posted 1 week ago

Apply

6.0 - 10.0 years

0 Lacs

maharashtra

On-site

As a Cloud & AI Solution Engineer at Microsoft, you will be part of a dynamic team that is at the forefront of innovation in the realm of databases and analytics. Your role will involve working on cutting-edge projects that leverage the latest technologies to drive meaningful impact for commercial customers. If you are insatiably curious and deeply passionate about tackling complex challenges in the era of AI, this is the perfect opportunity for you. In this role, you will play a pivotal role in helping enterprises unlock the full potential of Microsoft's cloud database and analytics stack. You will collaborate closely with engineering leaders and platform teams to accelerate the Fabric Data Platform, including Azure Databases and Analytics. Your responsibilities will include hands-on engagements such as Proof of Concepts, hackathons, and architecture workshops to guide customers through secure, scalable solution design and accelerate database and analytics migration into their deployment workflows. To excel in this position, you should have at least 10+ years of technical pre-sales or technical consulting experience, or a Bachelor's/Master's Degree in Computer Science or related field with 4+ years of technical pre-sales experience. You should be an expert on Azure Databases (SQL DB, Cosmos DB, PostgreSQL) and Azure Analytics (Fabric, Azure Databricks, Purview), as well as competitors in the data warehouse, data lake, big data, and analytics space. Additionally, you should have experience with cloud and hybrid infrastructure, architecture designs, migrations, and technology management. As a trusted technical advisor, you will guide customers through solution design, influence technical decisions, and help them modernize their data platform to realize the full value of Microsoft's platform. 
You will drive technical sales, lead hands-on engagements, build trusted relationships with platform leads, and maintain deep expertise in Microsoft's Analytics Portfolio and Azure Databases. By joining our team, you will have the opportunity to accelerate your career growth, develop deep business acumen, and hone your technical skills. You will be part of a collaborative and creative team that thrives on continuous learning and flexible work opportunities. If you are ready to take on this exciting challenge and be part of a team that is shaping the future of cloud Database & Analytics, we invite you to apply and join us on this journey.

Posted 1 week ago

Apply

0 years

0 Lacs

Greater Chennai Area

On-site

OneMagnify is a global performance marketing organization working at the intersection of brand marketing, technology, and analytics. The Company’s core offerings accelerate business, amplify real-time results, and help set their clients apart from their competitors. OneMagnify partners with clients to design, implement and manage marketing and brand strategies using analytical and predictive data models that provide valuable customer insights to drive higher levels of sales conversion. OneMagnify’s commitment to employee growth and development extends far beyond typical approaches. We take great pride in fostering an environment where each of our 700+ colleagues can thrive and achieve their personal best. OneMagnify has been recognized as a Top Workplace, Best Workplace and Cool Workplace in the United States for 10 consecutive years and recently was recognized as a Top Workplace in India. About You: You have a passion for data, a keen attention to detail, and an analytical mindset. You are a problem-solver who enjoys working collaboratively, using data insights to drive business solutions. What you’ll do: Work under the mentorship of a Senior Data Analyst and/or Ford supervisors to support client objectives and project goals by developing data-driven, insightful reports, visualizations, and dashboards, and communicating results within project lifecycle guidelines using appropriate programming languages and visualization tools. What you’ll need: Experience solving complex data and analytics issues through data manipulation, analysis, presentation, and reporting, and the ability to multitask between ad hoc and project-based deliverables.
Bachelor’s or Master’s degree in a technical field (e.g., Computer Science, Information Systems, Mathematics, Statistics). Required technical skills: SQL, Alteryx, QlikView; plus familiarity with Qlik Sense, Hadoop, Teradata, Python, SAS, R, and dashboarding tools such as Tableau or Spotfire. Benefits: We offer a comprehensive benefits package including medical insurance, PF, gratuity, paid holidays, and more. About Us: Whether it’s awareness, advocacy, engagement, or efficacy, we move brands forward with work that connects with audiences and delivers results. Through meaningful analytics, engaging communications, and innovative technology solutions, we help clients tackle their most ambitious projects and overcome their biggest challenges. We are an equal opportunity employer. We believe that innovative ideas and solutions start with unique perspectives. That’s why we’re committed to providing every employee a workplace that’s free of discrimination and intolerance. We’re proud to be an equal opportunity employer and actively search for like-minded people to join our team.

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

karnataka

On-site

As a Support Engineer at Precisely, you will play a crucial role in driving solutions to complex issues and ensuring the success of our customers. Your technical expertise will be essential in supporting Precisely Data Integration investments, including various products and Sterling B2B Integrator. Your problem-solving skills, technical depth, effective communication, and ability to innovate will be key attributes for excelling in this role. In this position, your responsibilities will include providing top-notch technical support via phone, email, and remote desktop connections, meeting SLA requirements, updating stakeholders promptly, and documenting critical information. You will be tasked with swiftly resolving issues to guarantee customer satisfaction, investigating and solving complex problems across different platforms, software systems, and databases. Your understanding of enterprise systems will be pivotal in identifying the root cause of issues and recommending suitable solutions. Continuous learning and knowledge sharing are integral parts of this role. You will be expected to stay updated on new technologies, tools, and systems and share your insights with the team. Developing comprehensive internal and external Knowledge Base documentation will be essential for enhancing customer and team support. Additionally, you will contribute to debugging, suggesting solutions, and tools for product improvements. Requirements for this role include a Bachelor's or Master's degree in Computer Science or a related field, exceptional communication skills, strong analytical abilities, and a self-motivated approach to problem-solving. A keen interest in learning new technologies, understanding software design principles, and proficiency in database management systems and networking design are essential. Experience with debugging, object-oriented languages, distributed computing, and various technologies will be advantageous. 
If you are enthusiastic about tackling challenging problems, working under tight deadlines, and providing excellent technical support, this role at Precisely offers a rewarding opportunity to grow and contribute to a leading organization in data integrity.

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

karnataka

On-site

The Software Engineering team at Dell Technologies is dedicated to delivering next-generation application enhancements and new products to meet the evolving needs of the world. As a Software Principal Engineer in Bangalore, you will be at the forefront of designing and developing software using cutting-edge technologies, tools, and methodologies in collaboration with both internal and external partners. Your primary responsibility will be to develop sophisticated systems and software solutions aligned with our customers' business goals and requirements. You will work closely with business stakeholders, conduct data analysis, ETL tasks, and data administration independently. Additionally, you will design, develop, and maintain scalable ETL pipelines, collaborating with various teams to ensure data requirements are met. It will be essential to stay updated on industry trends, mentor junior data engineers, and provide technical guidance as needed. To excel in this role, you should have a minimum of 8 years of industry experience with a focus on advanced ETL skills, including proficiency in tools like Airflow, ControlM, or Informatica. Strong Python programming skills for data manipulation, along with a solid understanding of SQL and NoSQL databases, are required. Experience with big data technologies such as Hadoop, Spark, or Kafka, as well as familiarity with cloud platforms like AWS, Azure, ADF Synapse, and SQLserver, will be beneficial. Desirable qualifications include experience with product data and product lifecycle management, as well as working knowledge of data visualization tools like Tableau or Power BI. At Dell Technologies, we believe in the power of each team member to make a positive impact. We prioritize our team members' growth and development, offering opportunities to work with cutting-edge technology and some of the industry's best minds.
If you are seeking a rewarding career where you can contribute to a future that benefits everyone, we invite you to join us. Application closing date: 30 July 2025. Dell Technologies upholds the principle of equal employment opportunity and is committed to providing a work environment free of discrimination and harassment for all employees. If you are ready to take the next step in your career with us, we look forward to welcoming you to our team.

Posted 1 week ago

Apply

10.0 years

0 Lacs

Navi Mumbai, Maharashtra, India

On-site

Job Title: IT Lead
Reporting Structure: Reports to Manager - IT Infra (Navi Mumbai location)
Education: Bachelor's degree in any engineering discipline / BCA / MCA
Qualifications:
- 7 to 10 years of hands-on experience managing overall IT day-to-day operations to improve infrastructure costs, performance, and end-user satisfaction.
- Able to work with cutting-edge technology and assimilate information rapidly.
- Independent thinker with the ability to work with little or no oversight.
- Strong interpersonal and articulation skills, both spoken and written; strong problem-solving and communication skills.
- Practical experience building IT infrastructure strategy in collaboration with business departments and vendors.
- Experience managing IT staff and IT infrastructure equipment, with in-depth knowledge of servers, databases, storage, network, backup, and HW/SW components.
- Knowledge of the Red Hat OpenShift Container Platform, along with Kubernetes and OS administration of RHEL and Windows.
- Practical experience handling IT infrastructure, with working knowledge of replication, storage provisioning, IP subnetting, backup policies, and OS hardening.
- Working knowledge of Cloudera in a Hadoop (Big Data) environment.
- Practical experience managing on-premises IT infrastructure and hosting applications in an on-premises DC.
- Service management knowledge of handling incidents, changes, and problems, with the ability to troubleshoot complex issues and provide RCA.
- Able to translate complex ideas for non-technical staff/customers, empowering learning and knowledge transfer.
- Good knowledge of vendor, stakeholder, and SLA management.
- Analytical thinking, communication, teamwork, relationship management, subject matter expertise, soft skills, and service delivery commitment.
Industry: Technology, ITES, Shared Services, or Banking organization
Responsibilities:
- Build infrastructure strategy in collaboration with business departments.
- Participate in all IT infrastructure activities, including infra RFP creation and vendor evaluation; assist vendor teams with infra-related discussions and defining infrastructure requirements.
- Oversee the daily operations of the technical support team while participating as an active member of the team.
- Maintain and track records of daily reported problems, resolutions, and actions taken for user issues.
- Provide timely resolution of ITSM service tickets to maintain SLA commitments.
- Check, update, and track IT asset inventory.
- Create relevant support material for the team.
- Ensure that all customer inquiries and issues are resolved correctly, promptly, and professionally.
- Review all technical-support-related processes and documentation for continuous improvement.
- Manage staff and network/server equipment.
- Participate in and manage DR drills, including documentation of BCP plans and creation of SOPs for various IT processes.
- Understand application flows and support troubleshooting of application-related issues.
- Follow change management and organizational processes.
- Actively participate in planning and execution of activities, collaborating with the team on future direction and opportunities for new technology.
- Provide regular updates to senior management and clients with good communication skills.
Must Have: Knowledge of ITSM tools; patch, backup, archival, and restoration management; operating systems; antivirus and security solutions; in-depth knowledge of servers and networks, data organization, and Linux and Microsoft servers.
Good To Have: Conceptual knowledge of Big Data and RDBMS. (CCNA, ITIL, RHEL, SAN/NAS, CompTIA, or Microsoft certifications preferred.)
Employment Type: All positions are on a fixed-term contract on a full-time basis exclusively for ReBIT, initially for a period of five years, extendable by mutual consent.
Location: Navi Mumbai (Kharghar/Belapur) (ref:hirist.tech)
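The working knowledge of IP subnetting this role requires can be sketched with Python's standard ipaddress module. The network and host addresses below are invented for illustration only:

```python
import ipaddress

# Split a /24 office network into four /26 subnets, a typical
# provisioning task when segmenting server, storage, and backup VLANs.
office = ipaddress.ip_network("10.20.30.0/24")
subnets = list(office.subnets(prefixlen_diff=2))

for net in subnets:
    # Each /26 holds 64 addresses; network and broadcast are not usable.
    print(net, "-", net.num_addresses - 2, "usable hosts")

# Check which subnet a given server lands in.
server = ipaddress.ip_address("10.20.30.70")
home = next(net for net in subnets if server in net)
print("server", server, "belongs to", home)
```

The same module also validates addresses and detects overlapping ranges, which is handy when documenting an existing estate.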

Posted 1 week ago

Apply

13.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Tech Specialist with strong analytical and technical ability, with over 13 years of experience in enterprise web applications, REST services, and workflow processing service development using Java/J2EE technologies. Experienced in working on medium to large enterprise projects, preferably in financial services.

Knowledge/Experience:
- Hands-on experience designing and developing scalable, distributed applications; able to architect large-scale applications using Spark, Kafka, and big data technologies.
- Knowledge of Hadoop architecture.
- Knowledge of frameworks: Velocity, Spring, Spring Boot.
- Knowledge of OOAD, UML, and design patterns.
- Strong insight into OOP concepts and good hands-on experience with Java (version 1.8 or above) and other Java-based frameworks such as Spring Batch, Spring IoC, Spring Annotation, and Spring Security.
- Hands-on experience with messaging platforms such as Kafka.
- Good working knowledge of jBPM as a BPMN framework is a must.
- Good working knowledge of Docker, Kubernetes, and OpenShift is a must.
- Strong knowledge of Java design patterns, microservice design patterns, event streams, event/message-based architecture, domain-driven design, etc.
- Strong knowledge of API-based architecture and SOA.
- Expertise in serverless architectures, Tomcat (embedded/non-embedded), Jetty (embedded/non-embedded), WebSphere, Spring Batch, Spring IoC, Spring Annotation, and Spring Security.
- Expertise in mocking, JUnit, and performance testing of solutions.
- Basic Unix/Linux knowledge, able to write and understand basic shell scripts and basic Unix commands.
- Good working knowledge of in-memory distributed caches (Hazelcast, GemFire) is good to have.
- Experience working in an Agile/DevOps environment.
- Knowledge of web server setup and configuration with reverse proxy/SSL setup (preferably nginx) is a plus.

Good to have:
- Financial markets background is preferable but not a must.
- Knowledge of testing concepts (TDD, BDD) is a plus.
- Knowledge of ELK/AppDynamics.
- Knowledge of other programming languages and frameworks such as Vaadin (UI framework), Kotlin, Scala, and shell scripting is good to have.

Key Responsibilities:
- Act as a seasoned SME and technical specialist in the Client Onboarding/AML/KYC/Account Opening domain.
- Translate business requirements into technical documents/code.
- Employ appropriate standards, frameworks, and patterns while designing and developing components.
- Implement and maintain a suite of workflow-driven Java applications with RESTful services.
- Develop high-quality code employing software engineering and testing best practices, including software that processes, persists, and distributes data via relational and non-relational technologies.
- Hands-on coding, authoring unit tests (JUnit) and performance tests, and maintaining high code quality.
- React and provide quick turnaround to business requirements and management requests.
- Well versed in the Agile development life cycle and capable of leading a team of developers.
- Partner with database developers to implement ingestion, orchestration, quality/reconciliation, and distribution services.
- Ability to work independently, good communication skills, and experience working on complex, medium to large projects.

Job Background:
- The position is based in India and is required to focus on delivery of the work, ensuring a robust design. This role may report to the technology team lead based in Pune.
- The candidate should be able to work independently, should be self-motivated, and might be required to work with vendors or third parties in joint delivery teams.
- The role requires application of technical skills and knowledge of the business to develop solutions to meet business needs.
- As part of large, geographically distributed teams, the candidate may have to manage stakeholders across multiple functional areas.
- The position requires analytical skills to filter, prioritize, and validate potentially complex material, technical or business, from multiple sources.
- The candidate will work with complex and variable issues with substantial potential impact, weighing various alternatives and balancing potentially conflicting needs.

Qualification:
- Bachelor's degree (in science, computers, information technology, or engineering).
- The candidate should be willing to work late in the evening India time on a need basis in order to interact with US and other global teams.

------------------------------------------------------ Job Family Group: Technology ------------------------------------------------------ Job Family: Applications Development ------------------------------------------------------ Time Type: Full time ------------------------------------------------------ Most Relevant Skills Please see the requirements listed above. ------------------------------------------------------ Other Relevant Skills For complementary skills, please see above and/or contact the recruiter.
------------------------------------------------------ Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.

Posted 1 week ago

Apply

7.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Summary: We are seeking an experienced professional (7+ years) to join our Production/Application Support team. The ideal candidate will bring a blend of strong technical skills (Unix, SQL, ITIL, Autosys, Big Data technologies) and good domain expertise in financial services (e.g., securities, secured financing, rates, liquidity reporting, derivatives, front-office/back-office systems, trading lifecycle).

Key Responsibilities:
- Provide L2 production support for mission-critical liquidity reporting and financial applications, ensuring high availability and performance.
- Monitor and resolve incidents related to trade capture, batch failures, position keeping, market data, pricing, risk, and liquidity reporting.
- Proactively manage alerts, logs, and jobs using Autosys, Unix tools, and monitoring platforms (ITRS/AWP).
- Execute advanced SQL queries and scripts for data analysis, validation, and issue resolution.
- Support multiple applications built on stored procedures, SSIS, SSRS, and Big Data ecosystems (Hive, Spark, Hadoop), and troubleshoot data pipeline issues.
- Maintain and improve knowledge bases, SOPs, and runbooks for production support.
- Participate in change management and release activities, including deployment validations.
- Lead root cause analysis (RCA), conduct post-incident reviews, and drive permanent resolutions.
- Collaborate with infrastructure teams on capacity, performance, and system resilience initiatives.
- Contribute to continuous service improvement, stability management, and automation initiatives.

Required Skills & Qualifications:
- Bachelor's or master's degree in computer science, information technology, engineering, or a related field.
- 7+ years of experience in application or production support, with 2+ years at an advanced level.
- Strong hands-on experience with Unix/Linux (scripting, file manipulation, job control); SQL (MSSQL/Oracle or similar; stored procedures, SSIS, SSRS); Big Data technologies (Hadoop, Hive, Spark); job schedulers such as Autosys; and log analysis tools.
- Solid understanding of financial instruments and the trade lifecycle (equities, fixed income, secured financing, derivatives, liquidity management).
- Knowledge of front-office/back-office and reporting workflows and operations.
- Excellent analytical and problem-solving skills, with the ability to work in a time-sensitive environment.
- Effective communication and stakeholder management skills across business and technical teams.
- Experience with ITIL processes, including incident, problem, and change management.

------------------------------------------------------ Job Family Group: Technology ------------------------------------------------------ Job Family: Applications Support ------------------------------------------------------ Time Type: Full time ------------------------------------------------------ Most Relevant Skills Please see the requirements listed above. ------------------------------------------------------ Other Relevant Skills For complementary skills, please see above and/or contact the recruiter. ------------------------------------------------------ Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.
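The alert and log triage this support role automates can be sketched in a few lines of Python. The log format, job names, and error threshold below are hypothetical, not taken from the posting; a real setup would read Autosys or application logs rather than an inline string:

```python
import re
from collections import Counter

# Hypothetical batch log lines in "TIMESTAMP LEVEL JOB message" form.
LOG = """\
2025-07-01T02:00:01 INFO  LQR_EOD_LOAD started
2025-07-01T02:14:09 ERROR LQR_EOD_LOAD hive partition missing
2025-07-01T02:14:10 ERROR LQR_EOD_LOAD retry 1 failed
2025-07-01T02:30:00 INFO  RATES_FEED started
2025-07-01T02:31:12 WARN  RATES_FEED slow upstream response
"""

PATTERN = re.compile(r"^\S+\s+(?P<level>\w+)\s+(?P<job>\S+)\s+(?P<msg>.*)$")

def error_summary(log_text, threshold=2):
    """Count ERROR lines per job and flag any job at or over the
    threshold, the kind of triage an L2 engineer scripts before paging."""
    errors = Counter()
    for line in log_text.splitlines():
        m = PATTERN.match(line)
        if m and m.group("level") == "ERROR":
            errors[m.group("job")] += 1
    return {job: n for job, n in errors.items() if n >= threshold}

print(error_summary(LOG))  # {'LQR_EOD_LOAD': 2}
```

In practice the flagged jobs would feed an ITRS alert or an ITSM ticket rather than a print statement.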

Posted 1 week ago

Apply

5.0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site

We Are Hiring: Data Engineer | 5+ Years Experience

Job Title: Data Engineer
Location: Ahmedabad
Work Mode: On-site
Experience: 5+ years
Employment Type: Full-time
Availability: Immediate joiner preferred

Join Our Team as a Data Engineer: We are seeking a passionate and experienced Data Engineer to join our dynamic, forward-thinking team in Ahmedabad. This is an exciting opportunity for someone who thrives on transforming raw data into powerful insights and building scalable, high-performance data infrastructure. As a Data Engineer, you will work closely with data scientists, analysts, and cross-functional teams to design robust data pipelines, optimize data systems, and enable data-driven decision-making across the organization.

Your Key Responsibilities:
- Architect, build, and maintain scalable and reliable data pipelines from diverse data sources.
- Design effective data storage and retrieval mechanisms and data models to support analytics and business needs.
- Implement data validation, transformation, and quality monitoring processes.
- Collaborate with cross-functional teams to deliver impactful, data-driven solutions.
- Proactively identify bottlenecks and optimize existing workflows and processes.
- Provide guidance and mentorship to junior engineers on the team.

Skills & Expertise We're Looking For:
- 4+ years of hands-on experience in data engineering or related roles.
- Strong expertise in Python and data pipeline design.
- Experience with Big Data tools such as Hadoop, Spark, and Hive.
- Proficiency with SQL, NoSQL databases, and data warehousing solutions.
- Solid experience with cloud platforms, particularly Azure.
- Familiarity with distributed computing, data modeling, and performance tuning.
- Understanding of DevOps, Power Automate, and Microsoft Fabric is a plus.
- Strong analytical thinking, collaboration skills, excellent communication skills, and the ability to work independently or as part of a team.

Qualifications: Bachelor's degree in Computer Science, Data Science, or a related field. (ref:hirist.tech)
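The data validation, transformation, and quality monitoring work described above can be sketched as a single pipeline stage in plain Python. The field names and validation rules are illustrative assumptions; a production pipeline would typically run this logic in Spark or a managed ETL service:

```python
def validate_and_transform(rows):
    """One pipeline stage: validate raw records, transform the good ones,
    and emit quality metrics alongside the clean output."""
    clean, rejected = [], []
    for row in rows:
        try:
            amount = float(row["amount"])
            if amount < 0:
                raise ValueError("negative amount")
            clean.append({"id": row["id"].strip(),
                          "amount": round(amount, 2)})
        except (KeyError, ValueError) as exc:
            rejected.append({"row": row, "reason": str(exc)})
    metrics = {"input": len(rows), "clean": len(clean),
               "rejected": len(rejected)}
    return clean, rejected, metrics

raw = [{"id": " A1 ", "amount": "10.50"},
       {"id": "A2", "amount": "-3"},
       {"id": "A3"}]  # missing "amount" field
clean, rejected, metrics = validate_and_transform(raw)
print(metrics)  # {'input': 3, 'clean': 1, 'rejected': 2}
```

Keeping rejects and metrics as first-class outputs, rather than silently dropping bad rows, is what makes quality monitoring possible downstream.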

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

chennai, tamil nadu

On-site

As a skilled and experienced Database Administrator (DBA), you will be responsible for managing and supporting our database environments to ensure optimal performance, integrity, and security. Working closely with other IT team members and stakeholders, you will play a crucial role in ensuring that our data systems operate efficiently and meet the business needs. Your qualifications include a Bachelor's degree in Computer Science, Information Technology, or a related field. A Master's degree or relevant certifications such as Oracle DBA or Microsoft SQL Server Certified would be a plus. With at least 5+ years of proven experience in managing database systems, you should have hands-on experience with major DBMS platforms like Oracle, SQL Server, MySQL, PostgreSQL, and MongoDB. Proficiency in SQL for querying and managing databases, along with knowledge of database design, data modeling, and normalization, is essential. Your responsibilities will include installing, configuring, and maintaining database software and related tools, monitoring database performance, and ensuring optimal resource utilization. Additionally, you will perform routine maintenance tasks, implement database security measures, and analyze performance metrics to identify bottlenecks and improve query efficiency. Strong analytical and problem-solving skills, excellent communication abilities, and the capacity to manage multiple tasks and projects simultaneously are required. Experience with cloud-based database services like AWS RDS, Google Cloud SQL, and big data technologies such as Hadoop would be beneficial. You will also participate in database design and data modeling activities, ensure data integrity through normalization and data validation, and develop and maintain documentation including data dictionaries and schema diagrams. 
Implementing robust backup and recovery procedures, managing disaster recovery planning, enforcing database security policies, and ensuring compliance with data privacy regulations are crucial aspects of your role. Collaboration with developers, system administrators, and stakeholders to ensure seamless database integration, as well as providing technical support and troubleshooting for database-related issues, will be part of your everyday tasks. Additionally, you may need to participate in on-call rotations and respond to critical database incidents to maintain the efficiency and security of our database systems.
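The backup and recovery duties described above can be sketched with Python's built-in sqlite3 module standing in for a production DBMS; the table name and data are invented for illustration:

```python
import sqlite3

# Routine DBA task: take an online backup, then verify it before
# trusting it for recovery.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance REAL)")
src.executemany("INSERT INTO accounts VALUES (?, ?)",
                [(1, 100.0), (2, 250.5)])
src.commit()

# Online backup: pages are copied while the source stays available.
dst = sqlite3.connect(":memory:")
src.backup(dst)

# Verify integrity and row counts on the copy, not just the original.
status = dst.execute("PRAGMA integrity_check").fetchone()[0]
rows = dst.execute("SELECT COUNT(*) FROM accounts").fetchone()[0]
print(status, rows)  # ok 2
```

The principle carries over to Oracle RMAN or SQL Server backups: a backup is only as good as its last verified restore.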

Posted 1 week ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Join us as a Data Engineer at Barclays, where you'll take part in the evolution of our digital landscape, driving innovation and excellence. You'll harness cutting-edge technology to revolutionize our digital offerings, ensuring unparalleled customer experiences. As part of the team, you will deliver the technology stack, using strong analytical and problem-solving skills to understand business requirements and deliver quality solutions. You'll be working on complex technical problems that require detailed analysis, in conjunction with fellow engineers, business analysts, and business stakeholders.

To be successful as a Data Engineer you should have:
- Strong experience with ETL tools such as Ab Initio, Glue, PySpark, Python, DBT, and Databricks, and the required AWS services/products.
- Advanced SQL knowledge across multiple database platforms (Teradata, Hadoop, SQL, etc.).
- Experience with data warehousing concepts and dimensional modeling.
- Proficiency in scripting languages (Python, Perl, shell scripting) for automation.
- Knowledge of big data technologies (Hadoop, Spark, Hive), which is highly desirable.
- A Bachelor's degree in Computer Science, Information Systems, or a related field.
- Experience in ETL development and data integration, with a proven track record of implementing complex ETL solutions in enterprise environments.
- Experience with data quality monitoring and implementing data governance practices.
- Knowledge of cloud data platforms (AWS, Azure, GCP) and their ETL services.

Some other highly valued skills include:
- Strong analytical and problem-solving skills, and the ability to work with large and complex datasets.
- Excellent documentation skills, attention to detail, and commitment to data quality.
- Ability to work independently and as part of a team.
- Strong communication skills to explain technical concepts to non-technical stakeholders.
You may be assessed on key critical skills relevant for success in role, such as risk and controls, change and transformation, business acumen, strategic thinking and digital and technology, as well as job-specific technical skills. This role is based in Pune. Purpose of the role: To build and maintain the systems that collect, store, process, and analyse data, such as data pipelines, data warehouses and data lakes, to ensure that all data is accurate, accessible, and secure. Accountabilities: Building and maintenance of data architecture pipelines that enable the transfer and processing of durable, complete and consistent data. Design and implementation of data warehouses and data lakes that manage the appropriate data volumes and velocity and adhere to the required security measures. Development of processing and analysis algorithms fit for the intended data complexity and volumes. Collaboration with data scientists to build and deploy machine learning models. Analyst Expectations: To perform prescribed activities in a timely manner and to a high standard, consistently driving continuous improvement. Requires in-depth technical knowledge and experience in their assigned area of expertise and a thorough understanding of the underlying principles and concepts within the area of expertise. They lead and supervise a team, guiding and supporting professional development, allocating work requirements and coordinating team resources. If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L – Listen and be authentic, E – Energise and inspire, A – Align across the enterprise, D – Develop others. OR for an individual contributor, they develop technical expertise in their work area, acting as an advisor where appropriate. Will have an impact on the work of related teams within the area.
Partner with other functions and business areas. Takes responsibility for end results of a team’s operational processing and activities. Escalate breaches of policies/procedures appropriately. Take responsibility for embedding new policies/procedures adopted due to risk mitigation. Advise and influence decision making within own area of expertise. Take ownership for managing risk and strengthening controls in relation to the work you own or contribute to. Deliver your work and areas of responsibility in line with relevant rules, regulations and codes of conduct. Maintain and continually build an understanding of how own sub-function integrates with function, alongside knowledge of the organisation's products, services and processes within the function. Demonstrate understanding of how areas coordinate and contribute to the achievement of the objectives of the organisation sub-function. Make evaluative judgements based on the analysis of factual information, paying attention to detail. Resolve problems by identifying and selecting solutions through the application of acquired technical experience and will be guided by precedents. Guide and persuade team members and communicate complex/sensitive information. Act as contact point for stakeholders outside of the immediate function, while building a network of contacts outside the team and external to the organisation. All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship – our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset – to Empower, Challenge and Drive – the operating manual for how we behave.
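The dimensional modeling experience this role asks for can be illustrated with a toy star schema in SQLite (Python's standard-library driver). The table and column names are illustrative assumptions, not any bank's actual systems:

```python
import sqlite3

# A minimal star schema: one fact table joined to one dimension.
db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE fact_sales  (product_key INTEGER, qty INTEGER,
                              FOREIGN KEY (product_key)
                                  REFERENCES dim_product (product_key));
    INSERT INTO dim_product VALUES (1, 'widget'), (2, 'gadget');
    INSERT INTO fact_sales  VALUES (1, 10), (1, 5), (2, 7);
""")

# Typical warehouse query: aggregate the facts, label them via the dimension.
report = db.execute("""
    SELECT d.name, SUM(f.qty) AS total
    FROM fact_sales f
    JOIN dim_product d USING (product_key)
    GROUP BY d.name
    ORDER BY total DESC
""").fetchall()
print(report)  # [('widget', 15), ('gadget', 7)]
```

Keeping measures in narrow fact tables and descriptive attributes in dimensions is what lets warehouse queries stay simple as data volumes grow; the same shape applies on Teradata or Hadoop-based warehouses.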

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

pune, maharashtra

On-site

As a talented Big Data Engineer, you will be responsible for developing and managing our company's Big Data solutions. Your role will involve designing and implementing Big Data tools and frameworks, implementing ELT processes, collaborating with development teams, building cloud platforms, and maintaining the production system. To excel in this position, you should possess in-depth knowledge of Hadoop technologies, exceptional project management skills, and advanced problem-solving abilities. A successful Big Data Engineer understands the company's needs and establishes scalable data solutions that meet current and future requirements effectively.

Your responsibilities will include meeting with managers to assess the company's Big Data requirements and developing solutions on AWS using tools such as Apache Spark, Databricks, Delta Tables, EMR, Athena, Glue, and Hadoop. You will also load disparate data sets, run pre-processing services using tools such as Athena, Glue, and Spark, collaborate with software research and development teams, build cloud platforms for application development, and maintain production systems.

The requirements for this role include a minimum of 5 years of experience as a Big Data Engineer; proficiency in Python and PySpark; expertise in Hadoop, Apache Spark, Databricks, Delta Tables, and AWS data analytics services; extensive experience with Delta Tables and the JSON and Parquet file formats; familiarity with AWS data analytics services such as Athena, Glue, Redshift, and EMR; and knowledge of data warehousing, NoSQL, and RDBMS databases. Good communication skills and the ability to solve complex data processing and transformation problems are essential for success in this role.
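In practice, much of the "loading disparate data sets" work described above is normalising differently shaped inputs into one schema before they land in Delta or Parquet tables. A hedged, stdlib-only sketch of that pre-processing step (the field names and unified schema are invented for illustration; a real pipeline would do this at scale in PySpark or Glue):

```python
import csv
import io
import json

# Invented unified schema: every source is coerced to these columns.
UNIFIED_COLUMNS = ("user_id", "event", "value")

def from_json_lines(text: str):
    """Parse newline-delimited JSON events into unified rows."""
    for line in text.splitlines():
        if not line.strip():
            continue
        obj = json.loads(line)
        yield {
            "user_id": str(obj["uid"]),
            "event": obj["type"],
            "value": float(obj.get("value", 0.0)),
        }

def from_csv(text: str):
    """Parse a CSV export (columns: id,event_name,amount) into unified rows."""
    for row in csv.DictReader(io.StringIO(text)):
        yield {
            "user_id": row["id"],
            "event": row["event_name"],
            "value": float(row["amount"]),
        }

def unify(*sources):
    """Concatenate already-normalised row iterators into one data set."""
    return [row for source in sources for row in source]
```

The same pattern maps directly onto Spark: each `from_*` adapter becomes a DataFrame read plus a `select`/`cast`, and `unify` becomes a `unionByName`.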

Posted 1 week ago

Apply

4.0 - 8.0 years

0 Lacs

pune, maharashtra

On-site

We are looking for a Java Developer to produce scalable software solutions on distributed systems like Hadoop using the Spark framework. You will be part of a cross-functional team responsible for the full software development life cycle, from conception to deployment. As a Developer, you should be comfortable with back-end coding, development frameworks, third-party libraries, and the Spark APIs required for application development on distributed platforms like Hadoop. Being a team player with a knack for visual design and utility is essential, and familiarity with Agile methodologies is an added advantage. A large part of the workloads and applications will be cloud-based, so knowledge of and experience with Google Cloud Platform (GCP) will be handy.

As part of our flexible scheme, here are some of the benefits you'll enjoy:

- Best-in-class leave policy
- Gender-neutral parental leave
- 100% reimbursement under the childcare assistance benefit (gender-neutral)
- Sponsorship for industry-relevant certifications and education
- Employee Assistance Program for you and your family members
- Comprehensive hospitalization insurance for you and your dependents
- Accident and term life insurance
- Complimentary health screening for those aged 35 and above

Your key responsibilities will include working with development teams and product managers to ideate software solutions; designing client-side and server-side architecture; building features and applications capable of running on distributed platforms and/or the cloud; developing and managing well-functioning applications supporting a microservices architecture; testing software for responsiveness and efficiency; troubleshooting, debugging, and upgrading software; creating security and data protection settings; and writing technical and design documentation. Additionally, you will be responsible for writing effective APIs (REST and SOAP).
To be successful in this role, you should have proven experience as a Java Developer or in a similar role as an individual contributor or development lead; familiarity with common stacks; strong knowledge of and working experience with Core Java, Spring Boot, REST APIs, and the Spark API; knowledge of the React framework and UI experience; knowledge of JUnit, Mockito, or other testing frameworks; familiarity with GCP services, design/architecture, and security frameworks; experience with databases (e.g., Oracle, PostgreSQL, BigQuery); familiarity with developing on distributed application platforms like Hadoop with Spark; excellent communication and teamwork skills; organizational skills and an analytical mind; a degree in Computer Science, Statistics, or a relevant field; and experience working in Agile environments. Good-to-have skills include knowledge of JavaScript frameworks (e.g., Angular, React, Node.js) and UI/UX design, knowledge of Python, and knowledge of NoSQL databases like HBase and MongoDB. You should have 4-7 years of prior working experience in a global banking, insurance, or financial organization.

You will receive training and development to help you excel in your career, coaching and support from experts in your team, and a culture of continuous learning to aid progression. We strive for a culture in which we are empowered to excel together every day, acting responsibly, thinking commercially, taking initiative, and working collaboratively. Together we share and celebrate the successes of our people. We welcome applications from all people and promote a positive, fair, and inclusive work environment.

Posted 1 week ago

Apply

4.0 - 12.0 years

0 Lacs

karnataka

On-site

As a Big Data Lead with 7-12 years of experience, you will be responsible for software development using multiple computing languages. Your role will involve working on distributed data processing systems and applications, specifically in Business Intelligence/Data Warehouse (BIDW) programs, and you should have previous experience taking development through testing, preferably on the J2EE stack. Knowledge and understanding of best practices and concepts in Data Warehouse applications will be crucial to your success in this role. You should possess a strong foundation in distributed systems and computing systems, with hands-on engineering skills. Hands-on experience with technologies such as Spark, Scala, Kafka, Hadoop, HBase, Pig, and Hive is required, and an understanding of NoSQL data stores, data modeling, and data management is essential. Good interpersonal communication skills, along with excellent oral and written communication and analytical skills, are necessary for effective collaboration within the team. Experience with Data Lake implementation as an alternative to a Data Warehouse is preferred. You should have hands-on experience with DataFrames using Spark SQL, proficiency in SQL, and a minimum of two end-to-end implementations of either a Data Warehouse or a Data Lake.
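The distributed processing model underlying Hadoop and Spark, which this role leans on, is the map–shuffle–reduce pattern. As a single-process illustration of those three stages (the canonical word count; a real job would run over RDDs or DataFrames across a cluster, with the shuffle happening between machines), using nothing beyond the Python standard library:

```python
from collections import defaultdict
from itertools import chain

def map_stage(lines):
    """Map: emit (word, 1) pairs per line, as a mapper would per input split."""
    return ([(word.lower(), 1) for word in line.split()] for line in lines)

def shuffle_stage(mapped):
    """Shuffle: group values by key, mimicking the cluster-wide exchange."""
    groups = defaultdict(list)
    for key, value in chain.from_iterable(mapped):
        groups[key].append(value)
    return groups

def reduce_stage(groups):
    """Reduce: aggregate each key's values independently."""
    return {key: sum(values) for key, values in groups.items()}

def word_count(lines):
    return reduce_stage(shuffle_stage(map_stage(lines)))
```

In Spark the same computation collapses to `rdd.flatMap(...).map(lambda w: (w, 1)).reduceByKey(add)`; the value of knowing the staged form is in reasoning about where the expensive shuffle occurs.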

Posted 1 week ago

Apply

6.0 - 10.0 years

0 Lacs

kolkata, west bengal

On-site

You must have knowledge of Azure Data Lake, Azure Functions, Azure Databricks, Azure Data Factory, and PostgreSQL; working knowledge of Azure DevOps and Git flow would be an added advantage. Alternatively, you should have working knowledge of AWS Kinesis, AWS EMR, AWS Glue, AWS RDS, AWS Athena, and AWS Redshift. Demonstrable expertise in working with timeseries data is essential, and experience delivering data engineering/data science projects in Industry 4.0 is an added advantage. Knowledge of Palantir is required.

You must possess strong problem-solving skills with a focus on sustainable and reusable development. Proficiency in statistical computing languages and libraries such as Python/PySpark, Pandas, NumPy, and seaborn/matplotlib is necessary; knowledge of Streamlit is a plus. Familiarity with Scala, Go, and Java, and with big data tools such as Hadoop, Spark, and Kafka, is beneficial. Experience with relational databases such as Microsoft SQL Server, MySQL, PostgreSQL, and Oracle, and with NoSQL databases including Hadoop, Cassandra, and MongoDB, is expected. Proficiency with data pipeline and workflow management tools such as Azkaban, Luigi, and Airflow is required, as is experience building and optimizing big data pipelines, architectures, and data sets. You should possess strong analytical skills for working with unstructured datasets, provide innovative solutions to data engineering problems, document technology choices and integration patterns, and apply best practices for project delivery with clean code, demonstrating innovation and proactiveness in meeting project requirements.

Reporting to: Director – Intelligent Insights and Data Strategy.
Travel: Must be willing to be deployed at client locations worldwide for long and short terms, and be flexible for shorter durations within India and abroad.
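The timeseries expertise called out above usually comes down to bucketing and smoothing timestamped readings. A stdlib-only sketch of a fixed-window average (the window size and record shape are invented for illustration; in the tools this listing names, the same operation would be a Pandas `resample` or a Spark window aggregation):

```python
from datetime import datetime, timedelta

def window_averages(readings, window_seconds=60):
    """Bucket (ISO timestamp, value) readings into fixed windows and average each.

    Returns a dict mapping window start (datetime) -> mean value.
    Readings need not be sorted.
    """
    window = timedelta(seconds=window_seconds)
    buckets = {}
    for ts, value in readings:
        t = datetime.fromisoformat(ts)
        # Snap the timestamp down to the start of its window.
        offset = (t - datetime.min) % window
        start = t - offset
        buckets.setdefault(start, []).append(value)
    return {start: sum(vals) / len(vals) for start, vals in sorted(buckets.items())}
```

The equivalent Pandas one-liner would be roughly `series.resample("60s").mean()`; the stdlib version just makes the bucketing arithmetic explicit.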

Posted 1 week ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies