
16420 Spark Jobs - Page 36

JobPe aggregates results for easy access, but you apply directly on the employer's job portal.

4.0 - 8.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Capco, a Wipro company, is a global technology and management consulting firm. It was named Consultancy of the Year at the British Bank Awards and ranked among the Top 100 Best Companies for Women in India 2022 by Avtar & Seramount. With a presence in 32 cities across the globe, we support 100+ clients across the banking, financial services, and energy sectors, and we are recognized for our deep transformation execution and delivery.

WHY JOIN CAPCO
You will work on engaging projects with the largest international and local banks, insurance companies, payment service providers, and other key players in the industry: projects that will transform the financial services industry.

MAKE AN IMPACT
Innovative thinking, delivery excellence, and thought leadership help our clients transform their business. Together with our clients and industry partners, we deliver disruptive work that is changing energy and financial services.

#BEYOURSELFATWORK
Capco has a tolerant, open culture that values diversity, inclusivity, and creativity.

CAREER ADVANCEMENT
With no forced hierarchy at Capco, everyone has the opportunity to grow as we grow, taking their career into their own hands.

DIVERSITY & INCLUSION
We believe that diversity of people and perspective gives us a competitive advantage.

JD for BI Consultant:
A successful candidate needs to be technical, with architect experience, and have a demonstrated ability to lead the administration of a complex BI infrastructure, primarily (but not limited to) MicroStrategy, Tableau Server, and Power BI, ensuring customer satisfaction, superior performance, security adherence, and supportability of the BI platform. Success in this role will require working effectively with a matrix organization within Data Platform and across other departments within XXXX. Examples include business intelligence for large-scale data lakes, warehousing, cloud technologies like Hadoop and Spark, and predictive analytics.
Responsibilities
- Help application development teams articulate, measure, and maintain non-functional requirements for BI applications
- Hands-on installation, configuration, monitoring, and tuning of BI systems, primarily MicroStrategy, Power BI, and Tableau Server, in accordance with standard operational procedures and policies
- Lead planning and execution of Business Intelligence tool upgrades
- Ensure performance and efficient capacity utilization by defining and creating BI system performance and utilization metrics
- Follow and advocate high security standards protecting XXXX data assets
- Participate in the overall strategy, standardization, analysis, design, and implementation of applications
- Promote security, innovation, transparency, ownership, respect, and inclusion

Qualifications
- Hands-on administration and software development experience with analytics and BI tools: MicroStrategy 10 or similar Business Intelligence products (Tableau Server, Business Objects)
- Demonstrated experience in BI tool report and dashboard development and design
- Must be an expert in Linux environments, network communication, and authentication protocols (SSO, SAML, Kerberos, and LDAP)
- Able to write code in Shell and Python
- Strong SQL or NoSQL coding and R scripting expertise required
- Understanding of the workings of a production environment, with change management, defect tracking, and status reporting processes
- Strong problem diagnostic and debugging skills
- Agile development experience, including DevOps and CI/CD lifecycle experience
- Strong communication skills, with the ability to present complex ideas and document them in a clear and concise way

Posted 5 days ago


4.0 - 8.0 years

10 - 14 Lacs

Pune

Work from Office

Capco, a Wipro company, is a global technology and management consulting firm. It was named Consultancy of the Year at the British Bank Awards and ranked among the Top 100 Best Companies for Women in India 2022 by Avtar & Seramount. With a presence in 32 cities across the globe, we support 100+ clients across the banking, financial services, and energy sectors, and we are recognized for our deep transformation execution and delivery.

WHY JOIN CAPCO
You will work on engaging projects with the largest international and local banks, insurance companies, payment service providers, and other key players in the industry: projects that will transform the financial services industry.

MAKE AN IMPACT
Innovative thinking, delivery excellence, and thought leadership help our clients transform their business. Together with our clients and industry partners, we deliver disruptive work that is changing energy and financial services.

#BEYOURSELFATWORK
Capco has a tolerant, open culture that values diversity, inclusivity, and creativity.

CAREER ADVANCEMENT
With no forced hierarchy at Capco, everyone has the opportunity to grow as we grow, taking their career into their own hands.

DIVERSITY & INCLUSION
We believe that diversity of people and perspective gives us a competitive advantage.

Job Title: Big Data Engineer - Scala

Preferred Skills:
- Strong skills in messaging technologies like Apache Kafka or equivalent
- Programming skills in Scala and Spark (with optimization techniques) and Python
- Should be able to write queries through Jupyter Notebook
- Orchestration tools like NiFi or Airflow
- Design and implement intuitive, responsive UIs that allow issuers to better understand data and analytics
- Experience with SQL and distributed systems
- Strong understanding of cloud architecture
- Ensure a high-quality code base by writing and reviewing performant, well-tested code
- Demonstrated experience building complex products
- Knowledge of Splunk or other alerting and monitoring solutions
- Fluent in the use of Git and Jenkins
- A broad understanding of software engineering concepts and methodologies is required
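Several of the skills above (Kafka, Spark, distributed systems) revolve around keyed event streams. As a loose illustration in plain Python, with invented event data, this is the key-based partitioning idea that keeps all events for one key ordered on a single partition (real Kafka uses a murmur2 hash; CRC32 stands in here for simplicity):

```python
import zlib

NUM_PARTITIONS = 3

def partition_for(key: str, num_partitions: int = NUM_PARTITIONS) -> int:
    # Deterministic key -> partition mapping. Real Kafka uses murmur2;
    # CRC32 stands in here for simplicity.
    return zlib.crc32(key.encode("utf-8")) % num_partitions

# Invented events: (key, value). Same key -> same partition -> per-key order.
events = [("user-1", "login"), ("user-2", "view"), ("user-1", "purchase")]
partitions = {p: [] for p in range(NUM_PARTITIONS)}
for key, value in events:
    partitions[partition_for(key)].append((key, value))

# Both "user-1" events land on one partition, in send order.
p = partition_for("user-1")
assert [v for k, v in partitions[p] if k == "user-1"] == ["login", "purchase"]
```

Because the mapping is deterministic, consumers can rely on per-key ordering without any cross-partition coordination; that is the property interview questions on Kafka ordering usually probe.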

Posted 5 days ago


6.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

We are the computational advertising team in the AI & Research organization at Microsoft, and we are looking for candidates with research and applied experience in machine learning. Search advertising is a $100 billion market worldwide. Microsoft's Bing search engine supports over 30% of desktop search in the US, with a similarly significant presence in many other countries.

We are a team of applied scientists working on machine learning components across the whole sponsored search stack. Our team works on problems related to machine learning, deep learning, natural language processing, image understanding, optimization, and information retrieval, among others. Our work entails building large-scale machine learning systems for user response modelling, ranking, and a number of other ML-driven business problems.

The Principal Applied Scientist will design, implement, analyze, and tune complex algorithms and ML systems, along with the supporting infrastructure for operating on large datasets. They will collaborate with top machine learning scientists and engineers to deliver direct business impact. We're looking for a sound understanding of, and insight into, productionizing machine learning models in large-scale systems, an ability to pick up new technical areas, and a commitment to developing, delivering, and supporting algorithms in production.

Microsoft's mission is to empower every person and every organization on the planet to achieve more. As employees we come together with a growth mindset, innovate to empower others, and collaborate to realize our shared goals. Each day we build on our values of respect, integrity, and accountability to create a culture of inclusion where everyone can thrive at work and beyond.

Responsibilities
You will master one or more subareas and gain expertise in a broad area of research.
You'll use your expertise to identify appropriate techniques for examining a problem and foster an audience for the product based on your understanding of the industry. You will review business and product requirements, identify problems, and develop strategies to resolve team problems. You'll develop an understanding of tools and methods being used in the community and hold team members accountable to support business impact. You will share industry and competitive insights and build capabilities through coaching. You'll provide guidance around best practices and collaborate with the academic community to develop the recruiting pipeline. You will document scientific work to ensure processes are captured. You'll also contribute to ethics and privacy policies related to research processes by providing updates and suggestions around best practices.

Qualifications
Required Qualifications:
- Bachelor's Degree in Statistics, Econometrics, Computer Science, Electrical or Computer Engineering, or a related field AND 6+ years related experience (e.g., statistics, predictive analytics, research); OR a Master's Degree in one of those fields AND 4+ years related experience; OR a Doctorate in one of those fields AND 3+ years related experience; OR equivalent experience
- Solid understanding of probability, statistics, machine learning, data science, A/B testing and analysis of ML models, and optimizing models for accuracy
- Experience with Hadoop, Spark, or other distributed computing systems for large-scale training and prediction with ML models
- Experience with end-to-end system design: data analysis, feature engineering, technique selection and implementation, debugging, and maintenance in production
- Experience implementing machine learning algorithms or research papers from scratch
- Solid experience with TensorFlow/PyTorch and deep learning models is a prerequisite

Other Requirements
The ability to meet Microsoft, customer, and/or government security screening requirements is required for this role. These requirements include, but are not limited to, the following specialized security screening: Microsoft Cloud Background Check. This position will be required to pass the Microsoft Cloud background check upon hire/transfer and every two years thereafter. #MicrosoftAI

Microsoft is an equal opportunity employer. Consistent with applicable law, all qualified applicants will receive consideration for employment without regard to age, ancestry, citizenship, color, family or medical care leave, gender identity or expression, genetic information, immigration status, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran or military status, race, ethnicity, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable local laws, regulations, and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application process, read more about requesting accommodations.
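The qualifications above call out A/B testing and analysis of ML models. As a rough sketch with made-up counts (not from the posting), a two-proportion z-test is the standard way to judge whether a model variant's conversion lift is statistically significant:

```python
from math import sqrt, erf

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    # Pooled two-proportion z-test: is variant B's rate different from A's?
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the normal CDF (via erf).
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Made-up counts: 5.20% vs 5.85% conversion over 10,000 users each.
z, p = two_proportion_z(conv_a=520, n_a=10_000, conv_b=585, n_b=10_000)
# z is about 2.0 and p about 0.04: significant at the 5% level for these counts
```

Interviewers typically also probe the assumptions behind this (independent samples, large-enough counts for the normal approximation, and a fixed sample size decided before peeking at results).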

Posted 5 days ago


8.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Established in 2004, OLIVER is the world's first and only specialist in designing, building, and running bespoke in-house agencies and marketing ecosystems for brands. We partner with over 300 clients in 40+ countries and counting. Our unique model drives creativity and efficiency, allowing us to deliver tailored solutions that resonate deeply with audiences.

As a part of The Brandtech Group, we're at the forefront of leveraging cutting-edge AI technology to revolutionise how we create and deliver work. Our AI solutions enhance efficiency, spark creativity, and drive insightful decision-making, empowering our teams to produce innovative and impactful results.

Role: YouTube Marketing Experiences - Manager
Location: Bangalore, India

What you will be doing:
Creator Meetups and Events. A key aspect of this role is organizing and managing creator-focused events:

Strategic Planning:
- Conceptualization: developing the concept, theme, and objectives for creator meetups, workshops, and other events, aligning them with overall creator marketing goals
- Agenda Development: creating engaging and informative event agendas that provide value to creators, such as educational sessions, networking opportunities, and exclusive announcements

Logistics and Execution:
- Venue Selection: identifying and securing appropriate venues for events, considering factors such as size, location, and amenities
- Vendor Management: coordinating with vendors and suppliers for catering, technical equipment, and other event needs
- Budget Management: developing and managing event budgets, ensuring cost-effectiveness and maximizing ROI
- Event Coordination: overseeing all logistical aspects of event execution, including registration, setup, on-site management, and post-event follow-up

Creator Engagement:
- Invitation and Communication: inviting creators to participate in events, managing RSVPs, and providing clear and timely communication before, during, and after the event
- Networking Facilitation: creating opportunities for creators to connect, network, and collaborate with each other, fostering a sense of community
- Feedback Collection: gathering feedback from creators after events to assess their satisfaction, identify areas for improvement, and inform future event planning

What you need to be great in this role:
- BA/BS degree or equivalent experience
- 8+ years of relevant experience in social media marketing and influencer marketing, across agencies or client side
- Excellent written and verbal communication skills
- Strong understanding of social media best practices and campaigns in India and around the world, and of the best brands and agencies
- Experience in social analytics; knowledge of social media monitoring and listening tools
- Strong strategic thinking; comfortable with data-backed decision making
- Demonstrated strong performance in prior roles, with independence, a strong sense of ownership, and the ability to form your own point of view and recommendations
- Proven team player with excellent interpersonal skills
- Distinctive problem-solving and analysis skills and innovative thinking for campaign roll-out
- Passion for, and curiosity about, AI and new technologies
- Understanding and knowledge of AI tools is beneficial, but the ability to learn and digest the benefits and features of AI tools is critical
- 50% travel required as part of this role

Req ID: 13809

Our values shape everything we do:
- Be Ambitious to succeed
- Be Imaginative to push the boundaries of what's possible
- Be Inspirational to do groundbreaking work
- Be always learning and listening to understand
- Be Results-focused to exceed expectations
- Be actively pro-inclusive and anti-racist across our community, clients, and creations

OLIVER, a part of the Brandtech Group, is an equal opportunity employer committed to creating an inclusive working environment where all employees are encouraged to reach their full potential, and individual differences are valued and respected.
All applicants shall be considered for employment without regard to race, ethnicity, religion, gender, sexual orientation, gender identity, age, neurodivergence, disability status, or any other characteristic protected by local laws. OLIVER has set ambitious environmental goals around sustainability, with science-based emissions reduction targets. Collectively, we work towards our mission, embedding sustainability into every department and through every stage of the project lifecycle.

Posted 5 days ago


5.0 - 7.0 years

4 - 8 Lacs

Hyderabad

Work from Office

Educational Requirements: MCA, MSc, Bachelor of Engineering, BBA, BCom
Service Line: Data & Analytics Unit

Responsibilities
A day in the life of an Infoscion: as part of the Infosys consulting team, your primary role would be to get to the heart of customer issues, diagnose problem areas, design innovative solutions, and facilitate deployment resulting in client delight. You will develop a proposal by owning parts of the proposal document and by giving inputs on solution design based on your areas of expertise. You will plan the activities of configuration, configure the product as per the design, conduct conference room pilots, and assist in resolving any queries related to requirements and solution design. You will conduct solution/product demonstrations and POC/Proof of Technology workshops, and prepare effort estimates which suit the customer's budgetary requirements and are in line with the organization's financial guidelines. You will actively lead small projects and contribute to unit-level and organizational initiatives with the objective of providing high-quality, value-adding solutions to customers. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!
Additional Responsibilities:
- Ability to develop value-creating strategies and models that enable clients to innovate, drive growth, and increase their business profitability
- Good knowledge of software configuration management systems
- Awareness of the latest technologies and industry trends
- Logical thinking and problem-solving skills, along with an ability to collaborate
- Understanding of the financial processes for various types of projects and the various pricing models available
- Ability to assess current processes, identify improvement areas, and suggest technology solutions
- Knowledge of one or two industry domains
- Client interfacing skills
- Project and team management

Technical and Professional Requirements: Python, PySpark, ETL, Data Pipelines, Big Data, AWS, GCP, Azure, Data Warehousing, Spark, Hadoop
Preferred Skills: Technology-Big Data-Big Data - ALL
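The Python/PySpark/ETL stack this posting lists boils down to the extract-transform-load pattern. A minimal sketch in plain Python follows; the table name, columns, and sample rows are invented, and SQLite stands in for a real warehouse:

```python
import sqlite3

def extract():
    # Stand-in for reading from an upstream source (file, API, queue).
    return [("2024-01-01", "IN", "105.50"), ("2024-01-01", "us", "88.25")]

def transform(rows):
    # Normalize country codes and parse amounts; drop malformed rows.
    out = []
    for date, country, amount in rows:
        try:
            out.append((date, country.upper(), float(amount)))
        except ValueError:
            continue
    return out

def load(rows, conn):
    conn.execute("CREATE TABLE IF NOT EXISTS sales (dt TEXT, country TEXT, amount REAL)")
    conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
total = conn.execute("SELECT SUM(amount) FROM sales").fetchone()[0]
# total == 193.75 for the two sample rows above
```

In PySpark the same three stages map onto `spark.read`, DataFrame transformations, and `df.write`; keeping each stage a pure function, as here, is what makes pipelines testable.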

Posted 5 days ago


3.0 - 8.0 years

8 - 12 Lacs

Bengaluru

Work from Office

Educational Requirements: MCA, MSc, Bachelor of Engineering, BBA, BSc
Service Line: Data & Analytics Unit

Responsibilities
A day in the life of an Infoscion: as part of the Infosys delivery team, your primary role would be to interface with the client for quality assurance and issue resolution, and to ensure high customer satisfaction. You will understand requirements, create and review designs, validate the architecture, and ensure high levels of service offerings to clients in the technology domain. You will participate in project estimation, provide inputs for solution delivery, conduct technical risk planning, and perform code reviews and unit test plan reviews. You will lead and guide your teams towards developing optimized, high-quality code deliverables, continual knowledge management, and adherence to organizational guidelines and processes. You would be a key contributor to building efficient programs/systems, and if you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!
Additional Responsibilities:
- Knowledge of more than one technology
- Basics of architecture and design fundamentals
- Knowledge of testing tools
- Knowledge of agile methodologies
- Understanding of project life cycle activities on development and maintenance projects
- Understanding of one or more estimation methodologies; knowledge of quality processes
- Basics of the business domain, to understand the business requirements
- Analytical abilities, strong technical skills, and good communication skills
- Good understanding of the technology and domain
- Ability to demonstrate a sound understanding of software quality assurance principles, SOLID design principles, and modelling methods
- Awareness of the latest technologies and trends
- Excellent problem-solving, analytical, and debugging skills

Technical and Professional Requirements:
Primary skills: Technology-Cloud Platform-Azure Development & Solution Architecting
Preferred Skills: Technology-Cloud Platform-Azure Development & Solution Architecting
Generic Skills: Technology-Cloud Integration-Azure Data Factory (ADF)

Posted 5 days ago


3.0 - 8.0 years

8 - 12 Lacs

Bengaluru

Work from Office

Educational Requirements: MCA, MSc, Bachelor of Engineering, BSc, Bachelor of Business Administration and Bachelor of Legislative Law (BBA LLB)
Service Line: Data & Analytics Unit

Responsibilities
A day in the life of an Infoscion: as part of the Infosys delivery team, your primary role would be to interface with the client for quality assurance and issue resolution, and to ensure high customer satisfaction. You will understand requirements, create and review designs, validate the architecture, and ensure high levels of service offerings to clients in the technology domain. You will participate in project estimation, provide inputs for solution delivery, conduct technical risk planning, and perform code reviews and unit test plan reviews. You will lead and guide your teams towards developing optimized, high-quality code deliverables, continual knowledge management, and adherence to organizational guidelines and processes. You would be a key contributor to building efficient programs/systems, and if you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Technical and Professional Requirements:
Primary skills: Technology-Cloud Platform-Azure Analytics Services-Azure Data Lake
Preferred Skills: Technology-Cloud Platform-Azure Development & Solution Architecting

Posted 5 days ago


7.0 - 9.0 years

3 - 7 Lacs

Bengaluru

Work from Office

Educational Requirements: Bachelor of Engineering
Service Line: Data & Analytics Unit

Responsibilities
A day in the life of an Infoscion: as part of the Infosys delivery team, your primary role would be to provide best-fit architectural solutions for one or more projects. You would also provide technology consultation and assist in defining the scope and sizing of work. You would implement solutions, create technology differentiation, and leverage partner technologies. Additionally, you would participate in competency development with the objective of ensuring best-fit, high-quality technical solutions. You would be a key contributor in creating thought leadership within your area of technology specialization, in compliance with the guidelines, policies, and norms of Infosys. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional Responsibilities:
- Knowledge of architectural design patterns, performance tuning, and database and functional designs
- Hands-on experience in Service-Oriented Architecture
- Ability to lead solution development and delivery for the designed solutions
- Experience in designing high-level and low-level documents is a plus
- Good understanding of SDLC is a prerequisite
- Awareness of the latest technologies and trends
- Logical thinking and problem-solving skills, along with an ability to collaborate

Technical and Professional Requirements:
Primary skills: Technology-Data on Cloud-DataStore-Snowflake
Preferred Skills: Technology-Data on Cloud-DataStore-Snowflake

Posted 5 days ago


5.0 - 10.0 years

5 - 8 Lacs

Bengaluru

Work from Office

Educational Requirements: MCA, MSc, Bachelor of Engineering, BBA, BCom
Service Line: Data & Analytics Unit

Roles & Responsibilities:
- Understand the requirements from the business and translate them into appropriate technical requirements
- Be responsible for successful delivery of MLOps solutions and services in client consulting environments
- Define key business problems to be solved; formulate high-level solution approaches and identify the data needed to solve them; develop, analyze, draw conclusions, and present to the client
- Assist clients with operationalization metrics to track the performance of ML models
- Help the team with ML pipelines, from creation to execution
- Guide the team in debugging pipeline failures
- Understand and take requirements on the operationalization of ML models from Data Scientists
- Engage with business stakeholders to provide status updates on the progress of development and issue fixes
- Set up standards related to coding, pipelines, and documentation
- Research new topics, services, and enhancements in cloud technologies

Additional Responsibilities:
EEO: Infosys is a global leader in next-generation digital services and consulting. We enable clients in more than 50 countries to navigate their digital transformation. With over four decades of experience in managing the systems and workings of global enterprises, we expertly steer our clients through their digital journey. We do it by enabling the enterprise with an AI-powered core that helps prioritize the execution of change. We also empower the business with agile digital at scale to deliver unprecedented levels of performance and customer delight.
Our always-on learning agenda drives their continuous improvement through building and transferring digital skills, expertise, and ideas from our innovation ecosystem. Infosys provides equal employment opportunities to applicants and employees without regard to race; color; sex; gender identity; sexual orientation; religious practices and observances; national origin; pregnancy, childbirth, or related medical conditions; status as a protected veteran or spouse/family member of a protected veteran; or disability.

Technical and Professional Requirements:
Preferred Qualifications:
- Experienced in an Agile way of working; able to manage team effort and track it through JIRA
- High-impact client communication
- Domain experience in Retail, CPG, and Logistics
- Experience in Test-Driven Development; experience using Pytest frameworks, Git version control, and REST APIs

The job may entail extensive travel. The job may also entail sitting as well as working at a computer for extended periods of time. Candidates should be able to effectively communicate by telephone, email, and face to face.

Preferred Skills: Technology-Machine learning-data science
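The MLOps responsibilities in this posting (building ML pipelines, debugging pipeline failures) come down to running named steps in order with an audit trail. A toy sketch of that shape follows; the step names and the trivial "model" (just a mean) are invented for illustration, and real tools like Kubeflow or Azure ML pipelines formalize the same idea:

```python
def ingest(ctx):
    ctx["data"] = [1.0, 2.0, 3.0, 4.0]  # stand-in for reading real data
    return ctx

def train(ctx):
    # Toy "model": just the mean of the data.
    ctx["model"] = sum(ctx["data"]) / len(ctx["data"])
    return ctx

def evaluate(ctx):
    # Worst-case absolute error of the toy model.
    ctx["metric"] = max(abs(x - ctx["model"]) for x in ctx["data"])
    return ctx

def run_pipeline(steps, ctx=None):
    ctx = ctx or {}
    for name, step in steps:
        ctx = step(ctx)
        ctx.setdefault("log", []).append(name)  # audit trail per step
    return ctx

result = run_pipeline([("ingest", ingest), ("train", train), ("evaluate", evaluate)])
# result["model"] == 2.5; result["log"] records the execution order
```

The per-step log is the part that matters operationally: when a pipeline fails, the log tells you which step to debug, which is exactly the "guide the team in debugging pipeline failures" duty above.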

Posted 5 days ago


8.0 - 13.0 years

11 - 15 Lacs

Bengaluru

Work from Office

Educational Requirements: MCA, MSc, Bachelor of Engineering, BBA, BSc
Service Line: Data & Analytics Unit

Responsibilities
Consulting Skills:
- Hypothesis-driven problem solving
- Go-to-market pricing and revenue growth execution
- Advisory, presentation, and data storytelling
- Project leadership and execution

Additional Responsibilities:
Typical work environment:
- Collaborative work with cross-functional teams across sales, marketing, and product development
- Stakeholder management and team handling
- Fast-paced environment with a focus on delivering timely insights to support business decisions
- Excellent problem-solving skills and the ability to address complex technical challenges
- Effective communication skills to collaborate with cross-functional teams and stakeholders
- Potential to work on multiple projects simultaneously, prioritizing tasks based on business impact

Qualification:
- Degree in Data Science, or in Computer Science with a data science specialization
- Master's in Business Administration and Analytics preferred

Technical and Professional Requirements:
Technical skills:
- Proficiency in programming languages like Python and R for data manipulation and analysis
- Expertise in machine learning algorithms and statistical modeling techniques
- Familiarity with data warehousing and data pipelines
- Experience with data visualization tools like Tableau or Power BI
- Experience with cloud platforms (e.g., ADF, Databricks, Azure) and their AI services

Preferred Skills: Technology-Big Data-Text Analytics

Posted 5 days ago


9.0 - 11.0 years

3 - 7 Lacs

Bengaluru

Work from Office

Educational Requirements: Bachelor of Engineering
Service Line: Data & Analytics Unit

Responsibilities
A day in the life of an Infoscion: as part of the Infosys consulting team, your primary role would be to lead the engagement effort of providing high-quality, value-adding consulting solutions to customers at different stages, from problem definition to diagnosis to solution design, development, and deployment. You will review the proposals prepared by consultants, provide guidance, and analyze the solutions defined for the client's business problems to identify any potential risks and issues. You will identify change management requirements and propose a structured approach to the client for managing the change, using multiple communication mechanisms. You will also coach and create a vision for the team, provide subject matter training for your focus areas, and motivate and inspire team members through effective and timely feedback and recognition for high performance. You would be a key contributor to unit-level and organizational initiatives with the objective of providing high-quality, value-adding consulting solutions to customers, adhering to the guidelines and processes of the organization. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!
Additional Responsibilities:
- Good knowledge of software configuration management systems
- Strong business acumen, strategy, and cross-industry thought leadership
- Awareness of the latest technologies and industry trends
- Logical thinking and problem-solving skills, along with an ability to collaborate
- Knowledge of two or three industry domains
- Understanding of the financial processes for various types of projects and the various pricing models available
- Client interfacing skills
- Knowledge of SDLC and agile methodologies
- Project and team management

Technical and Professional Requirements:
Primary skills: Technology-Data on Cloud-DataStore-Snowflake
Preferred Skills: Technology-Data on Cloud-DataStore-Snowflake

Posted 5 days ago


2.0 - 5.0 years

13 - 17 Lacs

Bengaluru

Work from Office

A career in IBM Consulting is rooted in long-term relationships and close collaboration with clients across the globe. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio, including Software and Red Hat.

In your role, you will be responsible for:
- Skills in multiple GCP services: GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer, etc.
- Must have Python and SQL work experience; proactive, collaborative, and able to respond to critical situations
- Ability to analyse data for functional business requirements and interface directly with customers

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform
- Skilled in multiple GCP services: GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer
- An ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work
- You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies
- End-to-end functional knowledge of the data pipeline/transformation implementations the candidate has worked on; should understand the purpose/KPIs for which the data transformation was done

Preferred technical and professional experience:
- Experience with AEM core technologies: OSGi services, Apache Sling, Granite framework, Java Content Repository API, Java 8+, localization
- Familiarity with build tools such as Jenkins and Maven
- Knowledge of version control tools, especially Git
- Knowledge of patterns and good practices for designing and developing quality, clean code
- Knowledge of HTML, CSS, JavaScript, and jQuery
- Familiarity with task management, bug tracking, and collaboration tools like JIRA and Confluence

Posted 5 days ago

Apply

3.0 - 6.0 years

10 - 14 Lacs

Gurugram

Work from Office

As an entry-level Application Developer at IBM, you'll work with clients to co-create solutions to major real-world challenges by using best-practice technologies, tools, techniques, and products to translate system requirements into the design and development of customized systems. In your role, you may be responsible for: Working across the entire system architecture to design, develop, and support high-quality, scalable products and interfaces for our clients. Collaborating with cross-functional teams to understand requirements and define technical specifications for generative AI projects. Employing IBM's Design Thinking to create products that provide a great user experience along with high performance, security, quality, and stability. Working with a variety of relational and NoSQL databases (SQL, Postgres, DB2, MongoDB), operating systems (Linux, Windows, iOS, Android), and modern UI frameworks (Backbone.js, AngularJS, React, Ember.js, Bootstrap, and jQuery). Creating everything from mockups and UI components to algorithms and data structures as you deliver a viable product. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: SQL authoring, query, and cost optimisation, primarily on BigQuery. Python as an object-oriented scripting language. Data pipeline, data streaming, and workflow management tools: Dataflow, Pub/Sub, Hadoop, Spark Streaming. Version control system: Git; knowledge of Infrastructure as Code (Terraform) preferable. Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL), as well as working familiarity with a variety of databases. Experience performing root cause analysis on internal and external data and processes to answer specific business questions. Preferred technical and professional experience: Experience building and optimising data pipelines, architectures, and data sets. 
Build processes supporting data transformation, data structures, metadata, dependency, and workload management. Working knowledge of message queuing, stream processing, and highly scalable data stores. Experience supporting and working with cross-functional teams in a dynamic environment. We are looking for a candidate with experience in a Data Engineer role who is also familiar with Google Cloud Platform.
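The stream-processing knowledge this posting asks for centers on windowed aggregation over event streams, which services like Dataflow and Pub/Sub perform at scale. A minimal sketch of the underlying idea in plain Python, assuming hypothetical event data (the function name and records here are illustrative, not part of any GCP API):

```python
from collections import defaultdict

def tumbling_window_counts(events, window_secs=60):
    """Group (timestamp, key) events into fixed-size windows and count per key.

    Toy illustration of tumbling-window aggregation; real pipelines would
    handle late data, watermarks, and distribution, which this sketch omits.
    """
    windows = defaultdict(lambda: defaultdict(int))
    for ts, key in events:
        window_start = (ts // window_secs) * window_secs  # align to window boundary
        windows[window_start][key] += 1
    return {w: dict(counts) for w, counts in sorted(windows.items())}

events = [(5, "click"), (42, "view"), (61, "click"), (65, "click")]
print(tumbling_window_counts(events))
# -> {0: {'click': 1, 'view': 1}, 60: {'click': 2}}
```

Events at t=5 and t=42 fall into the [0, 60) window, while t=61 and t=65 fall into [60, 120); a production framework applies the same grouping logic across distributed workers.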

Posted 5 days ago

Apply

3.0 - 7.0 years

10 - 14 Lacs

Chennai

Work from Office

The Developer leads cloud application development/deployment. A developer's responsibility is to lead the execution of a project by working with a senior-level resource on assigned development/deployment activities, and to design, build, and maintain cloud environments focusing on uptime, access, control, and network security using automation and configuration management tools. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Strong proficiency in Java, Spring Framework, Spring Boot, and RESTful APIs; excellent understanding of OOP and design patterns. Strong knowledge of ORM tools like Hibernate or JPA and Java-based microservices frameworks; hands-on experience with Spring Boot microservices. Primary skills: Core Java, Spring Boot, Java2/EE, Microservices, Hadoop ecosystem (HBase, Hive, MapReduce, HDFS, Pig, Sqoop, etc.), Spark; good to have Python. Strong knowledge of microservice logging, monitoring, debugging, and testing; in-depth knowledge of relational databases (e.g., MySQL). Experience with container platforms such as Docker and Kubernetes and messaging platforms such as Kafka or IBM MQ; good understanding of Test-Driven Development. Familiarity with Ant, Maven, or another build automation framework; good knowledge of basic UNIX commands. Preferred technical and professional experience: Experience in concurrent design and multi-threading.

Posted 5 days ago

Apply

3.0 - 6.0 years

14 - 18 Lacs

Kochi

Work from Office

As an Associate Software Developer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets. In this role, your responsibilities may include: Implementing and validating predictive models, as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques. Designing and implementing various enterprise search applications such as Elasticsearch and Splunk for client requirements. Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours. Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Total experience 6-7 years (relevant 4-5 years). Mandatory skills: Azure Databricks, Python/PySpark, SQL, GitHub, Azure DevOps, Azure Blob. Ability to use programming languages like Java, Python, Scala, etc., to build pipelines to extract and transform data from a repository to a data consumer. Ability to use Extract, Transform, and Load (ETL) tools and/or data integration or federation tools to prepare and transform data as needed. 
Ability to use leading-edge tools such as Linux, SQL, Python, Spark, Hadoop, and Java. Preferred technical and professional experience: You thrive on teamwork and have excellent verbal and written communication skills. Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions. Ability to communicate results to technical and non-technical audiences.

Posted 5 days ago

Apply

2.0 - 6.0 years

12 - 16 Lacs

Kochi

Work from Office

As an Associate Software Developer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets. In this role, your responsibilities may include: Implementing and validating predictive models, as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques. Designing and implementing various enterprise search applications such as Elasticsearch and Splunk for client requirements. Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours. Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Experience in big data technologies like Hadoop, Apache Spark, and Hive. Practical experience in Core Java (1.8 preferred)/Python/Scala. Experience with AWS cloud services including S3, Redshift, EMR, etc. Strong expertise in RDBMS and SQL. Good experience in Linux and shell scripting. Experience building data pipelines using Apache Airflow. Preferred technical and professional experience: You thrive on teamwork and have excellent verbal and written communication skills. 
Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions. Ability to communicate results to technical and non-technical audiences.
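The Apache Airflow experience this posting asks for revolves around declaring task dependencies and letting a scheduler derive the execution order. A minimal sketch of that idea using only the Python standard library (the task names here are hypothetical, and this uses `graphlib`, not Airflow itself):

```python
from graphlib import TopologicalSorter

# Toy pipeline: each task maps to the set of tasks it depends on.
# Airflow expresses the same dependencies with operators and >> chaining.
pipeline = {
    "extract": set(),
    "transform": {"extract"},
    "load": {"transform"},
    "report": {"load"},
}

# Resolve a valid execution order from the declared dependencies.
order = list(TopologicalSorter(pipeline).static_order())
print(order)
# -> ['extract', 'transform', 'load', 'report']
```

A real Airflow DAG adds scheduling, retries, and parallelism on top, but the core scheduling question (which task may run next, given what has finished) is exactly this topological ordering.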

Posted 5 days ago

Apply

5.0 - 9.0 years

13 - 17 Lacs

Mumbai

Work from Office

The role creates and maintains strong, trusted client relationships at all levels within the client's business. They design solution recommendations for their clients, applying their technical skills and designing centralized or distributed systems that both address the user's requirements and perform efficiently and effectively for banks. They should have an understanding of the banking industry and expertise in leveraging IBM technologies, architectures, integrated solutions, and offerings, focusing on the elements required to manage all aspects of data and information from business requirements to logical and physical design. They should be able to lead the information management lifecycle from acquisition, through cleansing, transformation, classification, and storage, to presentation, distribution, security, privacy, and archiving. Required education: Bachelor's Degree. Preferred education: Bachelor's Degree. Required technical and professional expertise: Banking data architect with deep knowledge of Cassandra, Pulsar, Debezium, etc., as well as banking domain knowledge. Preferred technical and professional experience: Data architect with deep knowledge of Cassandra, Pulsar, Debezium, etc., as well as banking domain knowledge.

Posted 5 days ago

Apply

2.0 - 5.0 years

4 - 8 Lacs

Mumbai

Work from Office

The ability to be a team player. The ability and skill to train other people in procedural and technical topics. Strong communication and collaboration skills. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Able to write complex SQL queries; experience with Azure Databricks. Preferred technical and professional experience: Excellent communication and stakeholder management skills.

Posted 5 days ago

Apply

5.0 - 10.0 years

22 - 27 Lacs

Kochi

Work from Office

Create Solution Outlines and Macro Designs to describe end-to-end product implementation in data platforms, including system integration, data ingestion, data processing, the serving layer, design patterns, and platform architecture principles. Contribute to pre-sales and sales support through RfP responses, solution architecture, planning, and estimation. Contribute to reusable component/asset/accelerator development to support capability development. Participate in customer presentations as a Platform Architect / Subject Matter Expert on Big Data, Azure Cloud, and related technologies. Participate in customer PoCs to deliver the outcomes. Participate in delivery reviews/product reviews and quality assurance, and work as a design authority. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Experience in designing data products providing descriptive, prescriptive, and predictive analytics to end users or other systems. Experience in data engineering and architecting data platforms. Experience in architecting and implementing data platforms on Azure Cloud Platform. Experience on Azure cloud is mandatory (ADLS Gen1/Gen2, Data Factory, Databricks, Synapse Analytics, Azure SQL, Cosmos DB, Event Hub, Snowflake), plus Azure Purview, Microsoft Fabric, Kubernetes, Terraform, and Airflow. Experience in the Big Data stack (Hadoop ecosystem: Hive, HBase, Kafka, Spark, Scala, PySpark, Python, etc.) with Cloudera or Hortonworks. Preferred technical and professional experience: Experience in architecting complex data platforms on Azure Cloud Platform and on-prem. Experience and exposure to implementation of Data Fabric and Data Mesh concepts and solutions like Microsoft Fabric, Starburst, Denodo, IBM Data Virtualisation, Talend, or Tibco Data Fabric. Exposure to data cataloging and governance solutions like Collibra, Alation, Watson Knowledge Catalog, Databricks Unity Catalog, Apache Atlas, Snowflake Data Glossary, etc.

Posted 5 days ago

Apply

2.0 - 6.0 years

12 - 16 Lacs

Kochi

Work from Office

As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities like creating pipelines/workflows for source-to-target and implementing solutions that tackle the client's needs. Your primary responsibilities include: Design, build, optimize, and support new and existing data models and ETL processes based on our clients' business requirements. Build, deploy, and manage data infrastructure that can adequately handle the needs of a rapidly growing data-driven organization. Coordinate data access and security to enable data scientists and analysts to easily access data whenever they need to. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Developed PySpark code for AWS Glue jobs and for EMR. Worked on scalable distributed data systems using the Hadoop ecosystem in AWS EMR and the MapR distribution. Developed Python and PySpark programs for data analysis. Good working experience with Python to develop a custom framework for generating rules (like a rules engine). Developed Hadoop streaming jobs using Python for integrating Python-API-supported applications. Developed Python code to gather data from HBase and designed the solution to implement it using PySpark. Used Apache Spark DataFrames/RDDs to apply business transformations and utilized Hive context objects to perform read/write operations. Rewrote some Hive queries in Spark SQL to reduce the overall batch time. Preferred technical and professional experience: Understanding of DevOps. Experience in building scalable end-to-end data ingestion and processing solutions. Experience with object-oriented and/or functional programming languages, such as Python, Java, and Scala.
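This posting mentions building a custom framework for generating rules, "like a rules engine". A minimal sketch of that pattern in plain Python, assuming hypothetical record fields and rule names (nothing here comes from the posting's actual framework):

```python
def make_rule(predicate, action):
    """Pair a match condition with a transformation; names are illustrative."""
    return (predicate, action)

def apply_rules(record, rules):
    """Run every rule in order, applying each action whose predicate matches."""
    for predicate, action in rules:
        if predicate(record):
            record = action(record)
    return record

# Two toy data-quality rules: flag negative amounts, default a missing currency.
rules = [
    make_rule(lambda r: r.get("amount", 0) < 0,
              lambda r: {**r, "valid": False}),
    make_rule(lambda r: "currency" not in r,
              lambda r: {**r, "currency": "USD"}),
]

print(apply_rules({"amount": -5}, rules))
# -> {'amount': -5, 'valid': False, 'currency': 'USD'}
```

In a PySpark setting the same rule list would typically be applied inside a `map` over a DataFrame/RDD, which keeps rule definitions separate from the execution engine.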

Posted 5 days ago

Apply

2.0 - 5.0 years

6 - 10 Lacs

Pune

Work from Office

Follow the SBI/client calendar (work on weekdays plus the 1st, 3rd, and 5th Saturdays). Manage 24x7 shift work (ensure that support is made available on a need basis). Assist clients in the selection, implementation, and support of the package. Make strategic recommendations and leverage business knowledge to drive solutions for clients and their management. Run or support workshops, meetings, and stakeholder interviews. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: 10+ years of experience in the data domain. Mandatory skills for screening: ETL DataStage, SQL queries. Should have 2 to 3 years of experience in analysis, design, and development. Good to have (not mandatory): CDC, DB2, TWS. Work from client location only. B.Tech with 60% is a must; if not, one of the following certifications is required: certification in DB2, CDC, ETL DataStage, or Oracle is preferred. Manages and maintains interfaces, built with DataStage (IBM's WebSphere Data Integration Suite software), between operational and external environments and the business intelligence environment. Preferred technical and professional experience: Experience in insurance or financial services is good to have.

Posted 5 days ago

Apply

2.0 - 6.0 years

12 - 16 Lacs

Bengaluru

Work from Office

As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities like creating pipelines/workflows for source-to-target and implementing solutions that tackle the client's needs. Your primary responsibilities include: Design, build, optimize, and support new and existing data models and ETL processes based on our clients' business requirements. Build, deploy, and manage data infrastructure that can adequately handle the needs of a rapidly growing data-driven organization. Coordinate data access and security to enable data scientists and analysts to easily access data whenever they need to. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Developed PySpark code for AWS Glue jobs and for EMR. Worked on scalable distributed data systems using the Hadoop ecosystem in AWS EMR and the MapR distribution. Developed Python and PySpark programs for data analysis. Good working experience with Python to develop a custom framework for generating rules (like a rules engine). Developed Hadoop streaming jobs using Python for integrating Python-API-supported applications. Developed Python code to gather data from HBase and designed the solution to implement it using PySpark. Used Apache Spark DataFrames/RDDs to apply business transformations and utilized Hive context objects to perform read/write operations. Rewrote some Hive queries in Spark SQL to reduce the overall batch time. Preferred technical and professional experience: Understanding of DevOps. Experience in building scalable end-to-end data ingestion and processing solutions. Experience with object-oriented and/or functional programming languages, such as Python, Java, and Scala.

Posted 5 days ago

Apply

3.0 - 6.0 years

14 - 18 Lacs

Kochi

Work from Office

A graduate degree in Computer Science, Statistics, Informatics, Information Systems, or another quantitative field. 7+ years total experience in data engineering projects and 4+ years of relevant experience with Azure technology services and Python. Azure: Azure Data Factory, ADLS (Azure Data Lake Store), Azure Databricks. Mandatory programming languages: PySpark, PL/SQL, Spark SQL. Database: SQL DB. Experience with Azure ADLS, Databricks, Stream Analytics, SQL DW, Cosmos DB, Analysis Services, Azure Functions, serverless architecture, and ARM templates. Experience with relational SQL and NoSQL databases, including Postgres and Cassandra. Experience with object-oriented/object function scripting languages: Python, SQL, Scala, Spark SQL, etc. Data warehousing experience with strong domain knowledge. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: An intuitive individual with an ability to manage change and proven time management. Proven interpersonal skills while contributing to team effort by accomplishing related results as needed. Up-to-date technical knowledge gained by attending educational workshops and reviewing publications. Preferred technical and professional experience: Experience with Azure ADLS, Databricks, Stream Analytics, SQL DW, Cosmos DB, Analysis Services, Azure Functions, serverless architecture, and ARM templates. Experience with relational SQL and NoSQL databases, including Postgres and Cassandra. Experience with object-oriented/object function scripting languages: Python, SQL, Scala, Spark SQL, etc.

Posted 5 days ago

Apply

3.0 - 6.0 years

14 - 18 Lacs

Bengaluru

Work from Office

As an Associate Software Developer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets. In this role, your responsibilities may include: Implementing and validating predictive models, as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques. Designing and implementing various enterprise search applications such as Elasticsearch and Splunk for client requirements. Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours. Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Total experience 3-6 years (relevant 4-5 years). Mandatory skills: Azure Databricks, Python/PySpark, SQL, GitHub, Azure DevOps, Azure Blob. Ability to use programming languages like Java, Python, Scala, etc., to build pipelines to extract and transform data from a repository to a data consumer. Ability to use Extract, Transform, and Load (ETL) tools and/or data integration or federation tools to prepare and transform data as needed. 
Ability to use leading-edge tools such as Linux, SQL, Python, Spark, Hadoop, and Java. Preferred technical and professional experience: You thrive on teamwork and have excellent verbal and written communication skills. Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions. Ability to communicate results to technical and non-technical audiences.

Posted 5 days ago

Apply

5.0 - 10.0 years

22 - 27 Lacs

Bengaluru

Work from Office

Create Solution Outlines and Macro Designs to describe end-to-end product implementation in data platforms, including system integration, data ingestion, data processing, the serving layer, design patterns, and platform architecture principles. Contribute to pre-sales and sales support through RfP responses, solution architecture, planning, and estimation. Contribute to reusable component/asset/accelerator development to support capability development. Participate in customer presentations as a Platform Architect / Subject Matter Expert on Big Data, Azure Cloud, and related technologies. Participate in customer PoCs to deliver the outcomes. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Candidates must have experience in designing data products providing descriptive, prescriptive, and predictive analytics to end users or other systems. 10 to 15 years of experience in data engineering and architecting data platforms. 5 to 8 years' experience in architecting and implementing data platforms on Azure Cloud Platform. 5 to 8 years' experience in architecting and implementing data platforms on-prem (Hadoop or DW appliance). Experience on Azure cloud is mandatory (ADLS Gen1/Gen2, Data Factory, Databricks, Synapse Analytics, Azure SQL, Cosmos DB, Event Hub, Snowflake), plus Azure Purview, Microsoft Fabric, Kubernetes, Terraform, and Airflow. Experience in the Big Data stack (Hadoop ecosystem: Hive, HBase, Kafka, Spark, Scala, PySpark, Python, etc.) with Cloudera or Hortonworks. Preferred technical and professional experience: Exposure to data cataloging and governance solutions like Collibra, Alation, Watson Knowledge Catalog, Databricks Unity Catalog, Apache Atlas, Snowflake Data Glossary, etc. Candidates should have experience in delivering both business decision support systems (reporting, analytics) and data science domains/use cases.

Posted 5 days ago

Apply