
16161 Spark Jobs - Page 26

JobPe aggregates listings for easy access, but applications are made directly on the original job portal.

3.0 - 8.0 years

8 - 12 Lacs

Bengaluru

Work from Office

Educational Requirements: MCA, MSc, Bachelor of Engineering, BSc, Bachelor of Business Administration and Bachelor of Legislative Law (BBA LLB)
Service Line: Data & Analytics Unit
Responsibilities: A day in the life of an Infoscion: as part of the Infosys delivery team, your primary role would be to interface with the client for quality assurance, issue resolution, and ensuring high customer satisfaction. You will understand requirements, create and review designs, validate the architecture, and ensure high levels of service offerings to clients in the technology domain. You will participate in project estimation, provide inputs for solution delivery, conduct technical risk planning, and perform code reviews and unit test plan reviews. You will lead and guide your teams towards developing optimized, high-quality code deliverables, continual knowledge management, and adherence to organizational guidelines and processes. You would be a key contributor to building efficient programs/systems. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!
Technical and Professional Requirements: Primary skills: Technology-Cloud Platform-Azure Analytics Services-Azure Data Lake
Preferred Skills: Technology-Cloud Platform-Azure Development & Solution Architecting

Posted 4 days ago

Apply

7.0 - 9.0 years

3 - 7 Lacs

Bengaluru

Work from Office

Educational Requirements: Bachelor of Engineering
Service Line: Data & Analytics Unit
Responsibilities: A day in the life of an Infoscion: as part of the Infosys delivery team, your primary role would be to provide best-fit architectural solutions for one or more projects. You would also provide technology consultation and assist in defining the scope and sizing of work. You would implement solutions, create technology differentiation, and leverage partner technologies. Additionally, you would participate in competency development with the objective of ensuring best-fit, high-quality technical solutions. You would be a key contributor in creating thought leadership within your area of technology specialization, in compliance with the guidelines, policies, and norms of Infosys. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!
Additional Responsibilities: Knowledge of architectural design patterns, performance tuning, and database and functional design. Hands-on experience in Service-Oriented Architecture. Ability to lead solution development and delivery for the designed solutions. Experience in designing high-level and low-level documents is a plus. A good understanding of SDLC is a prerequisite. Awareness of the latest technologies and trends. Logical thinking and problem-solving skills, along with an ability to collaborate.
Technical and Professional Requirements: Primary skills: Technology-Data on Cloud-DataStore-Snowflake
Preferred Skills: Technology-Data on Cloud-DataStore-Snowflake

Posted 4 days ago

Apply

5.0 - 10.0 years

5 - 8 Lacs

Bengaluru

Work from Office

Educational Requirements: MCA, MSc, Bachelor of Engineering, BBA, BCom
Service Line: Data & Analytics Unit
Responsibilities: Understand requirements from the business and translate them into appropriate technical requirements. Responsible for the successful delivery of MLOps solutions and services in client consulting environments. Define key business problems to be solved; formulate high-level solution approaches and identify the data needed to solve those problems; develop, analyze, draw conclusions, and present to the client. Assist clients with operationalization metrics to track the performance of ML models. Help the team with ML pipelines from creation to execution. Guide the team in debugging pipeline failures. Understand and take requirements on the operationalization of ML models from data scientists. Engage with business stakeholders with status updates on development progress and issue fixes. Set up standards related to coding, pipelines, and documentation. Research new topics, services, and enhancements in cloud technologies.
Additional Responsibilities: EEO: Infosys is a global leader in next-generation digital services and consulting. We enable clients in more than 50 countries to navigate their digital transformation. With over four decades of experience in managing the systems and workings of global enterprises, we expertly steer our clients through their digital journey. We do it by enabling the enterprise with an AI-powered core that helps prioritize the execution of change. We also empower the business with agile digital at scale to deliver unprecedented levels of performance and customer delight. Our always-on learning agenda drives their continuous improvement through building and transferring digital skills, expertise, and ideas from our innovation ecosystem. Infosys provides equal employment opportunities to applicants and employees without regard to race; color; sex; gender identity; sexual orientation; religious practices and observances; national origin; pregnancy, childbirth, or related medical conditions; status as a protected veteran or spouse/family member of a protected veteran; or disability.
Technical and Professional Requirements: Preferred qualifications: Experienced in the Agile way of working; able to manage team effort and track it through JIRA. High-impact client communication. Domain experience in Retail, CPG, and Logistics. Experience in Test-Driven Development and in using Pytest frameworks, Git version control, and REST APIs. The job may entail extensive travel. The job may also entail sitting as well as working at a computer for extended periods of time. Candidates should be able to communicate effectively by telephone, email, and face to face.
Preferred Skills: Technology-Machine learning-data science
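The Test-Driven Development and Pytest experience this listing asks for can be illustrated with a small, hedged sketch: a hypothetical model-drift metric (Population Stability Index) together with the pytest-style tests that would drive its development. The metric, names, and thresholds below are illustrative assumptions, not part of the listing.

```python
import math

def psi(expected, actual, eps=1e-6):
    """Population Stability Index between two binned distributions.
    Inputs are lists of bin proportions that each sum to 1."""
    return sum((a - e) * math.log((a + eps) / (e + eps))
               for e, a in zip(expected, actual))

# Pytest-style tests written first, TDD fashion: they pin down the
# behaviour the metric must have before it is wired into a pipeline.
def test_psi_identical_distributions():
    # Identical distributions should show (almost) zero drift.
    assert abs(psi([0.25, 0.25, 0.5], [0.25, 0.25, 0.5])) < 1e-6

def test_psi_flags_drift():
    # A heavily shifted distribution should produce a clearly positive PSI.
    assert psi([0.8, 0.1, 0.1], [0.1, 0.1, 0.8]) > 0.25
```

Under pytest, these functions would be collected and run automatically (`pytest test_psi.py`); here they use only the standard library so the sketch stays self-contained.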

Posted 4 days ago

Apply

8.0 - 13.0 years

11 - 15 Lacs

Bengaluru

Work from Office

Educational Requirements: MCA, MSc, Bachelor of Engineering, BBA, BSc
Service Line: Data & Analytics Unit
Responsibilities: Consulting skills: hypothesis-driven problem solving; go-to-market pricing and revenue growth execution; advisory, presentation, and data storytelling; project leadership and execution.
Additional Responsibilities: Typical work environment: collaborative work with cross-functional teams across sales, marketing, and product development. Stakeholder management and team handling. Fast-paced environment with a focus on delivering timely insights to support business decisions. Excellent problem-solving skills and the ability to address complex technical challenges. Effective communication skills to collaborate with cross-functional teams and stakeholders. Potential to work on multiple projects simultaneously, prioritizing tasks based on business impact.
Qualification: Degree in Data Science, or in Computer Science with a data science specialization. Master's in Business Administration and Analytics preferred.
Technical and Professional Requirements: Technical skills: proficiency in programming languages like Python and R for data manipulation and analysis; expertise in machine learning algorithms and statistical modeling techniques; familiarity with data warehousing and data pipelines; experience with data visualization tools like Tableau or Power BI; experience with cloud platforms (e.g., ADF, Databricks, Azure) and their AI services.
Preferred Skills: Technology-Big Data-Text Analytics
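The statistical-modeling proficiency named above can be sketched without any libraries: a closed-form ordinary-least-squares fit for a straight line, the kind of baseline model a candidate might walk through in an interview. This is an illustrative sketch under invented data, not a production method.

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a*x + b via the normal equations."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # slope = covariance(x, y) / variance(x)
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    a = cov / var
    b = mean_y - a * mean_x
    return a, b

# Toy data lying exactly on y = 2x + 1, so the fit recovers a=2, b=1.
a, b = fit_line([1, 2, 3, 4], [3, 5, 7, 9])
```

In practice this role would reach for scikit-learn or statsmodels; the point of the closed form is that the candidate can explain what those libraries compute.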

Posted 4 days ago

Apply

9.0 - 11.0 years

3 - 7 Lacs

Bengaluru

Work from Office

Educational Requirements: Bachelor of Engineering
Service Line: Data & Analytics Unit
Responsibilities: A day in the life of an Infoscion: as part of the Infosys consulting team, your primary role would be to lead the engagement effort of providing high-quality, value-adding consulting solutions to customers at different stages, from problem definition to diagnosis to solution design, development, and deployment. You will review the proposals prepared by consultants, provide guidance, and analyze the solutions defined for the client's business problems to identify any potential risks and issues. You will identify change management requirements and propose a structured approach to the client for managing the change using multiple communication mechanisms. You will also coach and create a vision for the team, provide subject matter training for your focus areas, and motivate and inspire team members through effective and timely feedback and recognition for high performance. You would be a key contributor to unit-level and organizational initiatives with the objective of providing high-quality, value-adding consulting solutions to customers while adhering to the guidelines and processes of the organization. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!
Additional Responsibilities: Good knowledge of software configuration management systems. Strong business acumen, strategy, and cross-industry thought leadership. Awareness of the latest technologies and industry trends. Logical thinking and problem-solving skills, along with an ability to collaborate. Knowledge of two or three industry domains. Understanding of the financial processes for various types of projects and the various pricing models available. Client-interfacing skills. Knowledge of SDLC and agile methodologies. Project and team management.
Technical and Professional Requirements: Primary skills: Technology-Data on Cloud-DataStore-Snowflake
Preferred Skills: Technology-Data on Cloud-DataStore-Snowflake

Posted 4 days ago

Apply

2.0 - 5.0 years

13 - 17 Lacs

Bengaluru

Work from Office

A career in IBM Consulting is rooted in long-term relationships and close collaboration with clients across the globe. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio, including Software and Red Hat.
In your role, you will be responsible for: Skill in multiple GCP services: GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflow, Composer, Error Reporting, Log Explorer, etc. Must have Python and SQL work experience; be proactive and collaborative, with the ability to respond to critical situations. Ability to analyse data for functional business requirements and work directly with customers.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise: 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform. Skilled in multiple GCP services: GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflow, Composer, Error Reporting, Log Explorer. An ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work. You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies. End-to-end functional knowledge of the data pipeline/transformation implementations the candidate has delivered; should understand the purpose/KPIs for which each data transformation was done.
Preferred technical and professional experience: Experience with AEM core technologies: OSGi services, Apache Sling, the Granite framework, the Java Content Repository API, Java 8+, and localization. Familiarity with build tools such as Jenkins and Maven. Knowledge of version control tools, especially Git. Knowledge of patterns and good practices for designing and developing quality, clean code. Knowledge of HTML, CSS, JavaScript, and jQuery. Familiarity with task management, bug tracking, and collaboration tools like JIRA and Confluence.

Posted 4 days ago

Apply

3.0 - 6.0 years

10 - 14 Lacs

Gurugram

Work from Office

As an entry-level Application Developer at IBM, you'll work with clients to co-create solutions to major real-world challenges by using best-practice technologies, tools, techniques, and products to translate system requirements into the design and development of customized systems.
In your role, you may be responsible for: Working across the entire system architecture to design, develop, and support high-quality, scalable products and interfaces for our clients. Collaborating with cross-functional teams to understand requirements and define technical specifications for generative AI projects. Employing IBM's Design Thinking to create products that provide a great user experience along with high performance, security, quality, and stability. Working with a variety of relational and NoSQL databases (SQL, Postgres, DB2, MongoDB), operating systems (Linux, Windows, iOS, Android), and modern UI frameworks (Backbone.js, AngularJS, React, Ember.js, Bootstrap, and jQuery). Creating everything from mockups and UI components to algorithms and data structures as you deliver a viable product.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise: SQL authoring, querying, and cost optimisation, primarily on BigQuery. Python as an object-oriented scripting language. Data pipeline, data streaming, and workflow management tools: Dataflow, Pub/Sub, Hadoop, Spark Streaming. Version control system: Git; knowledge of Infrastructure as Code (Terraform) is preferable. Advanced working SQL knowledge and experience working with relational databases and query authoring (SQL), as well as working familiarity with a variety of databases. Experience performing root cause analysis on internal and external data and processes to answer specific business questions.
Preferred technical and professional experience: Experience building and optimising data pipelines, architectures, and data sets. Experience building processes supporting data transformation, data structures, metadata, dependency, and workload management. Working knowledge of message queuing, stream processing, and highly scalable data stores. Experience supporting and working with cross-functional teams in a dynamic environment. We are looking for a candidate with experience in a Data Engineer role who is also familiar with Google Cloud Platform.
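As a rough illustration of the "message queuing, stream processing" knowledge this listing names, here is a minimal producer/consumer sketch using only Python's standard library. Real deployments in this role would sit on Pub/Sub or Kafka; the event names and aggregation are invented for the example.

```python
import queue
import threading

def producer(q, events):
    """Publish events to the queue, then a None sentinel for end-of-stream."""
    for e in events:
        q.put(e)
    q.put(None)

def consumer(q, totals):
    """Aggregate (key, value) events until the sentinel arrives."""
    while True:
        event = q.get()
        if event is None:
            break
        key, value = event
        totals[key] = totals.get(key, 0) + value

q = queue.Queue()
totals = {}
t = threading.Thread(target=consumer, args=(q, totals))
t.start()
producer(q, [("clicks", 1), ("views", 3), ("clicks", 2)])
t.join()
# totals now holds {"clicks": 3, "views": 3}
```

The sentinel pattern is the standard way to shut down a queue consumer cleanly; managed streaming systems replace it with explicit subscription lifecycle APIs.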

Posted 4 days ago

Apply

3.0 - 7.0 years

10 - 14 Lacs

Chennai

Work from Office

The Developer leads cloud application development/deployment. The developer's responsibility is to lead the execution of a project by working with a senior-level resource on assigned development/deployment activities, and to design, build, and maintain cloud environments focusing on uptime, access, control, and network security, using automation and configuration management tools.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise: Strong proficiency in Java, the Spring Framework, Spring Boot, and RESTful APIs; excellent understanding of OOP and design patterns. Strong knowledge of ORM tools like Hibernate or JPA and of Java-based microservices frameworks; hands-on experience with Spring Boot microservices. Primary skills: Core Java, Spring Boot, J2EE, Microservices; the Hadoop ecosystem (HBase, Hive, MapReduce, HDFS, Pig, Sqoop, etc.); Spark; Python is good to have. Strong knowledge of microservice logging, monitoring, debugging, and testing. In-depth knowledge of relational databases (e.g., MySQL). Experience with container platforms such as Docker and Kubernetes, and with messaging platforms such as Kafka or IBM MQ. Good understanding of Test-Driven Development. Familiarity with Ant, Maven, or another build automation framework. Good knowledge of basic UNIX commands. Experience in concurrent design and multi-threading.
Preferred technical and professional experience: Experience in concurrent design and multi-threading, plus the primary skills listed above.

Posted 4 days ago

Apply

3.0 - 6.0 years

14 - 18 Lacs

Kochi

Work from Office

As an Associate Software Developer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding on the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.
In this role, your responsibilities may include: Implementing and validating predictive models as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques. Designing and implementing enterprise search applications such as Elasticsearch and Splunk for client requirements. Working in an Agile, collaborative environment, partnering with scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours. Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise: Total experience 6-7 years (relevant: 4-5 years). Mandatory skills: Azure Databricks, Python/PySpark, SQL, GitHub, Azure DevOps, Azure Blob. Ability to use programming languages like Java, Python, Scala, etc., to build pipelines to extract and transform data from a repository to a data consumer. Ability to use Extract, Transform, and Load (ETL) tools and/or data integration or federation tools to prepare and transform data as needed. Ability to use leading-edge tools such as Linux, SQL, Python, Spark, Hadoop, and Java.
Preferred technical and professional experience: You thrive on teamwork and have excellent verbal and written communication skills. Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions. Ability to communicate results to technical and non-technical audiences.
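The pipeline-building ability described above (extracting and transforming data from a repository to a data consumer) can be sketched in miniature with plain Python. The data, helper names, and cleaning rule are invented for illustration; a real pipeline in this role would target Databricks/ADLS rather than in-memory lists.

```python
def extract():
    # Stand-in source: raw CSV-like rows (hypothetical data).
    return ["alice,30", "bob,", "carol,25"]

def transform(rows):
    # Parse, normalise names, and drop records with a missing age.
    out = []
    for row in rows:
        name, _, age = row.partition(",")
        if age:
            out.append({"name": name.title(), "age": int(age)})
    return out

def load(records, sink):
    # Append cleaned records to the target and report how many landed.
    sink.extend(records)
    return len(records)

warehouse = []
loaded = load(transform(extract()), warehouse)
# loaded == 2: the 'bob' record was dropped for a missing age
```

The three-function split mirrors how ETL stages are usually separated so each can be tested and rerun independently.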

Posted 4 days ago

Apply

2.0 - 6.0 years

12 - 16 Lacs

Kochi

Work from Office

As an Associate Software Developer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding on the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.
In this role, your responsibilities may include: Implementing and validating predictive models as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques. Designing and implementing enterprise search applications such as Elasticsearch and Splunk for client requirements. Working in an Agile, collaborative environment, partnering with scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours. Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise: Experience in big data technologies like Hadoop, Apache Spark, and Hive. Practical experience in Core Java (1.8 preferred), Python, or Scala. Experience with AWS cloud services including S3, Redshift, EMR, etc. Strong expertise in RDBMS and SQL. Good experience in Linux and shell scripting. Experience building data pipelines using Apache Airflow.
Preferred technical and professional experience: You thrive on teamwork and have excellent verbal and written communication skills. Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions. Ability to communicate results to technical and non-technical audiences.
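The Apache Airflow experience asked for above centers on expressing a pipeline as a DAG and running tasks in dependency order. A hedged, standard-library-only sketch of that core idea follows; the task names are hypothetical and this is not the Airflow API itself.

```python
from graphlib import TopologicalSorter

# task -> set of upstream tasks it depends on (mirrors Airflow's
# set_upstream / >> wiring, but as a plain dict)
dag = {
    "extract": set(),
    "transform": {"extract"},
    "validate": {"transform"},
    "load": {"validate"},
}

# A scheduler must execute tasks in a topological order of the DAG;
# for this linear chain there is exactly one valid order.
run_order = list(TopologicalSorter(dag).static_order())
```

Airflow adds scheduling, retries, and distributed execution on top, but the dependency-ordering contract is the same; `graphlib` (Python 3.9+) makes it visible in a few lines.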

Posted 4 days ago

Apply

5.0 - 9.0 years

13 - 17 Lacs

Mumbai

Work from Office

The role creates and maintains strong, trusted client relationships at all levels within the client's business. The architect designs solution recommendations for clients by applying their technical skills, designing centralized or distributed systems that both address the user's requirements and perform efficiently and effectively for banks. They should have an understanding of the banking industry and expertise in leveraging IBM technologies, architectures, integrated solutions, and offerings, focusing on the elements required to manage all aspects of data and information, from business requirements to logical and physical design. They should be able to lead the information management lifecycle from acquisition, through cleansing, transformation, classification, and storage, to presentation, distribution, security, privacy, and archiving.
Required education: Bachelor's Degree
Preferred education: Bachelor's Degree
Required technical and professional expertise: Banking data architect with deep knowledge of Cassandra, Pulsar, Debezium, etc., as well as banking domain knowledge.
Preferred technical and professional experience: Data architect with deep knowledge of Cassandra, Pulsar, Debezium, etc., as well as banking domain knowledge.

Posted 4 days ago

Apply

2.0 - 5.0 years

4 - 8 Lacs

Mumbai

Work from Office

The ability to be a team player. The ability and skill to train other people in procedural and technical topics. Strong communication and collaboration skills.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise: Able to write complex SQL queries; experience in Azure Databricks.
Preferred technical and professional experience: Excellent communication and stakeholder management skills.

Posted 4 days ago

Apply

5.0 - 10.0 years

22 - 27 Lacs

Kochi

Work from Office

Create Solution Outlines and Macro Designs to describe end-to-end product implementation in data platforms, including system integration, data ingestion, data processing, the serving layer, design patterns, and platform architecture principles. Contribute to pre-sales and sales support through RfP responses, solution architecture, planning, and estimation. Contribute to reusable component/asset/accelerator development to support capability development. Participate in customer presentations as a Platform Architect / Subject Matter Expert on Big Data, Azure Cloud, and related technologies. Participate in customer PoCs to deliver the outcomes. Participate in delivery reviews/product reviews and quality assurance, and act as a design authority.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise: Experience in designing data products providing descriptive, prescriptive, and predictive analytics to end users or other systems. Experience in data engineering and architecting data platforms. Experience in architecting and implementing data platforms on the Azure cloud platform. Experience on Azure cloud is mandatory (ADLS Gen1/Gen2, Data Factory, Databricks, Synapse Analytics, Azure SQL, Cosmos DB, Event Hubs, Snowflake), plus Azure Purview, Microsoft Fabric, Kubernetes, Terraform, and Airflow. Experience in the Big Data stack (Hadoop ecosystem: Hive, HBase, Kafka, Spark, Scala, PySpark, Python, etc.) with Cloudera or Hortonworks.
Preferred technical and professional experience: Experience in architecting complex data platforms on the Azure cloud platform and on-prem. Experience and exposure to implementations of Data Fabric and Data Mesh concepts and solutions such as Microsoft Fabric, Starburst, Denodo, IBM Data Virtualisation, Talend, or Tibco Data Fabric. Exposure to data cataloging and governance solutions like Collibra, Alation, Watson Knowledge Catalog, Databricks Unity Catalog, Apache Atlas, Snowflake data glossary, etc.

Posted 4 days ago

Apply

2.0 - 6.0 years

12 - 16 Lacs

Kochi

Work from Office

As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities like creating pipelines/workflows from source to target and implementing solutions that tackle the client's needs.
Your primary responsibilities include: Designing, building, optimizing, and supporting new and existing data models and ETL processes based on our clients' business requirements. Building, deploying, and managing data infrastructure that can adequately handle the needs of a rapidly growing data-driven organization. Coordinating data access and security to enable data scientists and analysts to easily access data whenever they need to.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise: Developed PySpark code for AWS Glue jobs and for EMR. Worked on scalable distributed data systems using the Hadoop ecosystem in AWS EMR and the MapR distribution. Developed Python and PySpark programs for data analysis. Good working experience with Python, including developing a custom framework for generating rules (like a rules engine). Developed Hadoop streaming jobs using Python to integrate applications supported by a Python API. Developed Python code to gather data from HBase and designed the solution for implementation using PySpark. Used Apache Spark DataFrames/RDDs to apply business transformations, and utilized Hive context objects to perform read/write operations. Rewrote some Hive queries in Spark SQL to reduce overall batch time.
Preferred technical and professional experience: Understanding of DevOps. Experience in building scalable end-to-end data ingestion and processing solutions. Experience with object-oriented and/or functional programming languages such as Python, Java, and Scala.
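The "custom framework for generating rules (like a rules engine)" mentioned in this listing can be sketched as an ordered list of predicate/action pairs applied to a record. Everything below is an illustrative assumption about what such a framework might look like, not the actual framework the listing refers to.

```python
# Each rule: (name, predicate deciding if it fires, action rewriting the record)
rules = [
    ("null_age", lambda r: r.get("age") is None,
                 lambda r: {**r, "age": 0}),
    ("adult",    lambda r: r.get("age", 0) >= 18,
                 lambda r: {**r, "segment": "adult"}),
]

def apply_rules(record, rules):
    """Apply each rule in order; return the rewritten record and the
    names of the rules that fired."""
    fired = []
    for name, predicate, action in rules:
        if predicate(record):
            record = action(record)
            fired.append(name)
    return record, fired

record, fired = apply_rules({"name": "dev", "age": 21}, rules)
# only the 'adult' rule fires, tagging the record with a segment
```

Keeping rules as data (rather than hard-coded branches) is what makes such a framework "generating": new rules can be loaded from config without touching the engine.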

Posted 4 days ago

Apply

2.0 - 5.0 years

6 - 10 Lacs

Pune

Work from Office

Follow the SBI/client calendar (work on weekdays plus the 1st, 3rd, and 5th Saturdays). Manage 24x7 shift work (ensure that support is made available on a need basis). Assist clients in the selection, implementation, and support of the package. Make strategic recommendations and leverage business knowledge to drive solutions for clients and their management. Run or support workshops, meetings, and stakeholder interviews.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise: 10+ years of experience in the data domain. Mandatory skills for screening: ETL DataStage, SQL queries. Should have 2 to 3 years of experience in analysis, design, and development. Good to have (not mandatory): CDC, DB2, TWS. Work from the client location only. B.Tech with 60% is a must; if not, one of the following certifications is required: DB2, CDC, ETL DataStage, or Oracle (preferred). Manages and maintains interfaces built with DataStage (IBM's WebSphere Data Integration Suite software) between operational and external environments and the business intelligence environment.
Preferred technical and professional experience: Experience in insurance or financial services is good to have.
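CDC (change data capture), listed above as a nice-to-have alongside DataStage, reduces conceptually to turning the difference between two snapshots of a keyed table into insert/update/delete events; production tools such as IBM InfoSphere CDC derive these from database logs instead. A toy, hedged sketch of the snapshot-diff view:

```python
def capture_changes(before, after):
    """before/after: dicts mapping primary key -> row.
    Returns a list of (kind, key, row) CDC events."""
    events = []
    for key in after:
        if key not in before:
            events.append(("insert", key, after[key]))
        elif after[key] != before[key]:
            events.append(("update", key, after[key]))
    for key in before:
        if key not in after:
            events.append(("delete", key, before[key]))
    return events

# Hypothetical account table: key 1 changes, key 3 appears, key 2 vanishes.
events = capture_changes(
    {1: {"bal": 100}, 2: {"bal": 50}},
    {1: {"bal": 120}, 3: {"bal": 10}},
)
```

Log-based CDC avoids the cost of full-snapshot comparison, but the event model it emits is the same three kinds shown here.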

Posted 4 days ago

Apply

2.0 - 6.0 years

12 - 16 Lacs

Bengaluru

Work from Office

As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities like creating pipelines/workflows from source to target and implementing solutions that tackle the client's needs.
Your primary responsibilities include: Designing, building, optimizing, and supporting new and existing data models and ETL processes based on our clients' business requirements. Building, deploying, and managing data infrastructure that can adequately handle the needs of a rapidly growing data-driven organization. Coordinating data access and security to enable data scientists and analysts to easily access data whenever they need to.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise: Developed PySpark code for AWS Glue jobs and for EMR. Worked on scalable distributed data systems using the Hadoop ecosystem in AWS EMR and the MapR distribution. Developed Python and PySpark programs for data analysis. Good working experience with Python, including developing a custom framework for generating rules (like a rules engine). Developed Hadoop streaming jobs using Python to integrate applications supported by a Python API. Developed Python code to gather data from HBase and designed the solution for implementation using PySpark. Used Apache Spark DataFrames/RDDs to apply business transformations, and utilized Hive context objects to perform read/write operations. Rewrote some Hive queries in Spark SQL to reduce overall batch time.
Preferred technical and professional experience: Understanding of DevOps. Experience in building scalable end-to-end data ingestion and processing solutions. Experience with object-oriented and/or functional programming languages such as Python, Java, and Scala.

Posted 4 days ago

Apply

3.0 - 6.0 years

14 - 18 Lacs

Kochi

Work from Office

Graduate degree in Computer Science, Statistics, Informatics, Information Systems, or another quantitative field. 7+ years of total experience in data engineering projects and 4+ years of relevant experience with Azure technology services and Python. Azure: Azure Data Factory, ADLS (Azure Data Lake Store), Azure Databricks. Mandatory programming languages: PySpark, PL/SQL, Spark SQL. Database: SQL DB. Experience with Azure ADLS, Databricks, Stream Analytics, SQL DW, Cosmos DB, Analysis Services, Azure Functions, serverless architecture, and ARM templates. Experience with relational SQL and NoSQL databases, including Postgres and Cassandra. Experience with object-oriented/functional scripting languages: Python, SQL, Scala, Spark-SQL, etc. Data warehousing experience with strong domain knowledge.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise: An intuitive individual with an ability to manage change and proven time management. Proven interpersonal skills while contributing to team effort by accomplishing related results as needed. Up-to-date technical knowledge gained by attending educational workshops and reviewing publications.
Preferred technical and professional experience: Experience with Azure ADLS, Databricks, Stream Analytics, SQL DW, Cosmos DB, Analysis Services, Azure Functions, serverless architecture, and ARM templates. Experience with relational SQL and NoSQL databases, including Postgres and Cassandra. Experience with object-oriented/functional scripting languages: Python, SQL, Scala, Spark-SQL, etc.

Posted 4 days ago

Apply

3.0 - 6.0 years

14 - 18 Lacs

Bengaluru

Work from Office

As an Associate Software Developer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets. In this role, your responsibilities may include: Implementing and validating predictive models as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques. Designing and implementing various enterprise search applications such as Elasticsearch and Splunk for client requirements. Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours. Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results. Required education Bachelor's Degree Preferred education Master's Degree Required technical and professional expertise Total experience 3-6 years (relevant: 4-5 years). Mandatory skills: Azure Databricks, Python/PySpark, SQL, GitHub, Azure DevOps, Azure Blob. Ability to use programming languages like Java, Python, Scala, etc., to build pipelines to extract and transform data from a repository to a data consumer. Ability to use Extract, Transform, and Load (ETL) tools and/or data integration or federation tools to prepare and transform data as needed.
Ability to use leading-edge tools such as Linux, SQL, Python, Spark, Hadoop and Java. Preferred technical and professional experience You thrive on teamwork and have excellent verbal and written communication skills. Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions. Ability to communicate results to technical and non-technical audiences.
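The extract-transform-load pipelines this listing describes can be sketched in plain Python. This is a minimal illustration of the pattern only, not any specific employer's framework; the source records, field names (user_id, email) and in-memory "warehouse" are invented stand-ins for a real repository and sink.

```python
# Minimal ETL sketch: extract from a source, cleanse/transform, load to a sink.
# All data and field names here are hypothetical, for illustration only.

def extract():
    # Stand-in for reading from a repository (e.g. a database table or lake path).
    return [
        {"user_id": "1", "email": " Alice@Example.com "},
        {"user_id": "2", "email": None},  # bad record, should be dropped
        {"user_id": "3", "email": "bob@example.com"},
    ]

def transform(rows):
    # Cleanse: drop rows missing an email, then normalise the remainder.
    return [
        {"user_id": r["user_id"], "email": r["email"].strip().lower()}
        for r in rows
        if r.get("email")
    ]

def load(rows, sink):
    # Stand-in for writing to a warehouse or lake table; returns rows written.
    sink.extend(rows)
    return len(rows)

warehouse = []
loaded = load(transform(extract()), warehouse)
```

In a PySpark job the same three stages would typically become a read, a chain of DataFrame transformations, and a write, but the shape of the pipeline is identical.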

Posted 4 days ago

Apply

5.0 - 10.0 years

22 - 27 Lacs

Bengaluru

Work from Office

Create solution outline and macro design to describe end-to-end product implementation in data platforms, including system integration, data ingestion, data processing, serving layer, design patterns, and platform architecture principles. Contribute to pre-sales and sales support through RFP responses, solution architecture, planning and estimation. Contribute to reusable component / asset / accelerator development to support capability development. Participate in customer presentations as Platform Architect / Subject Matter Expert on Big Data, Azure Cloud and related technologies. Participate in customer PoCs to deliver the outcomes. Required education Bachelor's Degree Preferred education Master's Degree Required technical and professional expertise Candidates must have experience in designing data products providing descriptive, prescriptive, and predictive analytics to end users or other systems. 10-15 years of experience in data engineering and architecting data platforms. 5-8 years' experience in architecting and implementing data platforms on the Azure Cloud Platform. 5-8 years' experience in architecting and implementing data platforms on-prem (Hadoop or DW appliance). Experience on Azure cloud is mandatory (ADLS Gen1/Gen2, Data Factory, Databricks, Synapse Analytics, Azure SQL, Cosmos DB, Event Hub, Snowflake), plus Azure Purview, Microsoft Fabric, Kubernetes, Terraform, Airflow. Experience in the Big Data stack (Hadoop ecosystem: Hive, HBase, Kafka, Spark, Scala, PySpark, Python, etc.) with Cloudera or Hortonworks. Preferred technical and professional experience Exposure to data cataloging and governance solutions like Collibra, Alation, Watson Knowledge Catalog, Databricks Unity Catalog, Apache Atlas, Snowflake data glossary, etc. Candidates should have experience in delivering both business decision support systems (reporting, analytics) and data science domains / use cases.

Posted 4 days ago

Apply

1.0 - 4.0 years

8 - 12 Lacs

Bengaluru

Work from Office

As a member of the Data & AI Upgrade SWAT team, you will assist clients with upgrading their deployment of Cloud Pak for Data by working in partnership with the IBM CSM (Client Success Manager) and, where applicable, Expert Labs, IBM Consulting, or other teams. Required education Bachelor's Degree Preferred education Master's Degree Required technical and professional expertise The ideal candidate should have strong technical skills on Cloud Pak for Data, a Data and AI platform on top of Red Hat OpenShift Container Platform, and a passion for working on challenging customer projects. The skill requirements are as follows: 5-7 years' working experience with Linux and Red Hat OpenShift Container Platform. Expertise in cloud technologies such as IBM Cloud, AWS or Azure. Familiarity with Python, JavaScript or other programming languages. Strong analytical and problem-solving skills. Good communication and project management skills in customer projects. Strong ability to learn new technologies and tools. Fluent level of English. Preferred technical and professional experience Knowledge of and experience with IBM Data and AI technologies and products would be preferred. Experience with installing or upgrading Cloud Pak for Data would be a plus. Experience with IBM Software Support would be a plus.

Posted 4 days ago

Apply

175.0 years

0 Lacs

Gurugram, Haryana, India

On-site

At American Express, our culture is built on a 175-year history of innovation, shared values and Leadership Behaviors, and an unwavering commitment to back our customers, communities, and colleagues. As part of Team Amex, you'll experience this powerful backing with comprehensive support for your holistic well-being and many opportunities to learn new skills, develop as a leader, and grow your career. Here, your voice and ideas matter, your work makes an impact, and together, you will help us define the future of American Express. Join Team Amex and let's lead the way together. Team Overview: Global Credit & Model Risk Oversight, Transaction Monitoring & GRC Capabilities (CMRC) provides independent challenge and ensures that significant credit and model risks are properly evaluated and monitored, and that Anti-Money Laundering (AML) risks are mitigated through the transaction monitoring program. In addition, CMRC hosts the central product organization responsible for the ongoing maintenance and modernization of GRC platforms and capabilities. How will you make an impact in this role? The AML Data Capabilities team was established with a mission to own and govern data across products (raw data, derivations, and organized views) to cater for analytics and production use cases, and to manage end-to-end data quality. The team comprises risk data experts with deep SME knowledge of risk data, systems and processes covering all aspects of the customer life cycle. Our mission is to build and support Anti-Money Laundering Transaction Monitoring data and rule needs in collaboration with Strategy and Technology partners, with a focus on our core tenets of timeliness, quality and process efficiency. Responsibilities include: · Develop and maintain organized data layers to cater for both production use cases and analytics for transaction monitoring of Anti-Money Laundering rules.
· Manage end-to-end big data integration processes for building key variables from disparate source systems with 100% accuracy and 100% on-time delivery. · Partner closely with Strategy and Modeling teams in building incremental intelligence, with strong emphasis on maintaining globalization and standardization of attribute calculations across portfolios. · Partner with Tech teams in designing and building next-generation data quality controls. · Drive automation initiatives within existing processes and fully optimize delivery effort and processing time. · Effectively manage relationships with stakeholders across multiple geographies. · Contribute to evaluating and/or developing the right tools, common components, and capabilities. · Follow industry-best agile practices to deliver on key priorities. · Implement defined rules on the Lucy platform to identify AML alerts. · Ensure processes and actions are logged and support regulatory reporting, documenting the analysis and the rule build in the form of a qualitative document for relevant stakeholders.
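The Lucy platform's actual rule format is proprietary and not described in this listing, so the transaction-monitoring idea can only be illustrated in general terms: a rule aggregates activity per customer and raises an alert when a threshold is crossed. The threshold value, field names, and sample transactions below are all invented for illustration.

```python
# Illustrative AML-style monitoring rule: flag customers whose same-day
# totals exceed a threshold (a crude "structuring" check). This is a
# generic sketch, not the Lucy platform's real rule semantics.
from collections import defaultdict

THRESHOLD = 10_000.0  # assumed daily threshold, for illustration only

def daily_alerts(transactions):
    """Return customer IDs whose same-day total exceeds THRESHOLD."""
    totals = defaultdict(float)
    for t in transactions:
        totals[(t["customer_id"], t["date"])] += t["amount"]
    return sorted({cust for (cust, _day), amt in totals.items() if amt > THRESHOLD})

txns = [
    {"customer_id": "C1", "date": "2024-01-02", "amount": 6000.0},
    {"customer_id": "C1", "date": "2024-01-02", "amount": 5000.0},  # pushes C1 over
    {"customer_id": "C2", "date": "2024-01-02", "amount": 800.0},
]
alerts = daily_alerts(txns)
```

Production rules would of course run over organized data layers in a big data stack (Hive/Spark) rather than in-memory lists, but the aggregate-then-threshold shape is the same.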
Minimum Qualifications · Academic background: Bachelor's degree with up to 2 years of relevant work experience · Strong Hive and SQL skills; knowledge of big data and related technologies · Hands-on experience with Hadoop and shell scripting is a plus · Understanding of data architecture and data engineering concepts · Strong verbal and written communication skills, with the ability to cater to a versatile technical and non-technical audience · Willingness to collaborate with cross-functional teams to drive validation and project execution · Good-to-have skills: Python / PySpark · Excellent analytical and critical thinking with attention to detail · Excellent planning and organizational skills, including the ability to manage inter-dependencies and execute under stringent deadlines · Exceptional drive and commitment; ability to work and thrive in a fast-changing, results-driven environment; and proven ability in handling competing priorities Behavioral Skills/Capabilities: Enterprise Leadership Behaviors Set the Agenda: Ø Ability to apply thought leadership and come up with ideas Ø Take the complete perspective into account while designing solutions Ø Use market best practices to design solutions Bring Others with You: Ø Collaborate with multiple stakeholders and other scrum teams to deliver on promises Ø Learn from peers and leaders Ø Coach and help peers Do It the Right Way: Ø Communicate effectively Ø Be candid and clear in communications Ø Make decisions quickly and effectively Ø Live the company culture and values We back you with benefits that support your holistic well-being so you can be and deliver your best.
This means caring for you and your loved ones' physical, financial, and mental health, as well as providing the flexibility you need to thrive personally and professionally: Competitive base salaries Bonus incentives Support for financial-well-being and retirement Comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location) Flexible working model with hybrid, onsite or virtual arrangements depending on role and business need Generous paid parental leave policies (depending on your location) Free access to global on-site wellness centers staffed with nurses and doctors (depending on location) Free and confidential counseling support through our Healthy Minds program Career development and training opportunities American Express is an equal opportunity employer and makes employment decisions without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, disability status, age, or any other status protected by law. Offer of employment with American Express is conditioned upon the successful completion of a background verification check, subject to applicable laws and regulations.

Posted 4 days ago

Apply

2.0 - 5.0 years

14 - 17 Lacs

Hyderabad

Work from Office

As a Data Engineer at IBM, you'll play a vital role in the development and design of applications, and provide regular support/guidance to project teams on complex coding, issue resolution and execution. Your primary responsibilities include: Lead the design and construction of new solutions using the latest technologies, always looking to add business value and meet user requirements. Strive for continuous improvement by testing the built solution and working within an agile framework. Discover and implement the latest technology trends to maximize and build creative solutions. Required education Bachelor's Degree Preferred education Master's Degree Required technical and professional expertise Experience with Apache Spark (PySpark): in-depth knowledge of Spark's architecture, core APIs, and PySpark for distributed data processing. Big data technologies: familiarity with Hadoop, HDFS, Kafka, and other big data tools. Data engineering skills: strong understanding of ETL pipelines, data modeling, and data warehousing concepts. Strong proficiency in Python: expertise in Python programming with a focus on data processing and manipulation. Data processing frameworks: knowledge of data processing libraries such as Pandas and NumPy. SQL proficiency: experience writing optimized SQL queries for large-scale data analysis and transformation. Cloud platforms: experience working with cloud platforms like AWS, Azure, or GCP, including using cloud storage systems. Preferred technical and professional experience Define, drive, and implement an architecture strategy and standards for end-to-end monitoring. Partner with the rest of the technology teams, including application development, enterprise architecture, testing services, and network engineering. Good to have: detection and prevention tools for company products, platform, and customer-facing applications.

Posted 4 days ago

Apply

2.0 - 5.0 years

14 - 17 Lacs

Mysuru

Work from Office

As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities like creating source-to-target pipelines/workflows and implementing solutions that tackle the clients' needs. Your primary responsibilities include: Design, build, optimize and support new and existing data models and ETL processes based on our clients' business requirements. Build, deploy and manage data infrastructure that can adequately handle the needs of a rapidly growing data-driven organization. Coordinate data access and security to enable data scientists and analysts to easily access data whenever they need to. Required education Bachelor's Degree Preferred education Master's Degree Required technical and professional expertise Must have 5+ years of experience in Big Data: Hadoop, Spark, Scala, Python, HBase, Hive. Good to have: AWS (S3, Athena, DynamoDB, Lambda), Jenkins, Git. Developed Python and PySpark programs for data analysis. Good working experience with Python to develop a custom framework for generating rules (like a rules engine). Developed Python code to gather data from HBase and designed solutions implemented using PySpark. Used Apache Spark DataFrames/RDDs to apply business transformations and utilized Hive Context objects to perform read/write operations. Preferred technical and professional experience Understanding of DevOps. Experience in building scalable end-to-end data ingestion and processing solutions. Experience with object-oriented and/or functional programming languages, such as Python, Java and Scala.
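The "custom framework for generating rules (like a rules engine)" mentioned in this listing can be sketched in a few lines of plain Python: rules as named predicates, and an evaluator that reports which rules a record triggers. Rule names and record fields here are invented for illustration; a production version would apply the same predicates as PySpark DataFrame filters over HBase- or Hive-backed data.

```python
# Minimal rules-engine sketch: each Rule pairs a name with a predicate,
# and evaluate() returns the names of the rules a record triggers.
# Rule names and fields are hypothetical examples.

class Rule:
    def __init__(self, name, predicate):
        self.name = name
        self.predicate = predicate

    def applies(self, record):
        return bool(self.predicate(record))

def evaluate(record, rules):
    """Return the names of every rule the record triggers, in rule order."""
    return [r.name for r in rules if r.applies(record)]

rules = [
    Rule("missing_country", lambda r: not r.get("country")),
    Rule("negative_amount", lambda r: r.get("amount", 0) < 0),
]

hits = evaluate({"amount": -5, "country": ""}, rules)
```

Keeping rules as data (rather than hard-coded conditionals) is what makes such a framework reusable: new data-quality or business rules can be registered without touching the evaluator.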

Posted 4 days ago

Apply

3.0 - 6.0 years

7 - 11 Lacs

Bengaluru

Work from Office

Advanced programming skills in Python, Scala, Go: strong expertise in developing and maintaining microservices in Go (or other similar languages), with the ability to lead and mentor others in this area. Extensive exposure to developing big data applications, data engineering, ETL and data analytics. Cloud expertise: in-depth knowledge of IBM Cloud or similar cloud platforms, with a proven track record of deploying and managing cloud-native applications. Leadership and collaboration: ability to lead cross-functional teams, work closely with product owners, and drive platform enhancements while mentoring junior team members. Security and compliance: strong understanding of security best practices and compliance standards, with experience ensuring that platforms meet or exceed these requirements. Analytical and problem-solving skills: excellent problem-solving abilities with a proven track record of resolving complex issues in a multi-tenant environment. Required education Bachelor's Degree Required technical and professional expertise 4-7 years' experience primarily using Apache Spark, Kafka and SQL, preferably in data engineering projects with a strong TDD approach. Advanced programming skills in languages like Python, Java, Scala, with proficiency in SQL. Extensive exposure to developing big data applications, data engineering, ETL tools and data analytics. Exposure to data modelling, data quality and data governance. Extensive exposure to creating and maintaining data pipelines (workflows to move data from various sources into data warehouses or data lakes). Cloud expertise: in-depth knowledge of IBM Cloud or similar cloud platforms, with a proven track record of developing, deploying and managing cloud-native applications. Good to have front-end development experience: React, Carbon, and Node for managing and improving user-facing portals.
Leadership and collaboration: ability to lead cross-functional teams, work closely with product owners, and drive platform enhancements while mentoring junior team members. Security and compliance: strong understanding of security best practices and compliance standards, with experience ensuring that platforms meet or exceed these requirements. Analytical and problem-solving skills: excellent problem-solving abilities with a proven track record of resolving complex issues in a multi-tenant environment. Preferred technical and professional experience Hands-on experience with data analysis and querying using SQL, and considerable exposure to ETL processes. Expertise in developing cloud applications with high-volume data processing. Experience building scalable microservices components using various API development frameworks.

Posted 4 days ago

Apply

3.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Newton School of Technology is on a mission to transform technology education and bridge the employability gap. As India's first impact university, we are committed to revolutionizing learning, empowering students, and shaping the future of the tech industry. Backed by renowned professionals and industry leaders, we aim to solve the employability challenge and create a lasting impact on society. We are currently looking for a Senior Data Scientist + Instructor to join our Computer Science Department. This is a full-time academic role focused on data mining, analytics, and teaching/mentoring students in core data science and engineering topics. Key Responsibilities Develop and deliver comprehensive and engaging lectures for the undergraduate "Data Mining", "Big Data", and "Data Analytics" courses, covering the full syllabus from foundational concepts to advanced techniques. Instruct students on the complete data lifecycle, including data preprocessing, cleaning, transformation, and feature engineering. Teach the theory, implementation, and evaluation of a wide range of algorithms for classification, association rule mining, clustering, and anomaly detection. Design and facilitate practical lab sessions and assignments that provide students with hands-on experience using modern data tools and software. Develop and grade assessments, including assignments, projects, and examinations, that effectively measure the Course Learning Objectives (CLOs). Mentor and guide students on projects, encouraging them to work with real-world or benchmark datasets (e.g., from Kaggle). Stay current with the latest advancements, research, and industry trends in data engineering and machine learning to ensure the curriculum remains relevant and cutting-edge. Contribute to the academic and research environment of the department and the university. Required Qualifications A Ph.D. (or a Master's degree with significant, relevant industry experience) in Computer Science, Data Science, Artificial Intelligence, or a closely related field. Demonstrable 3-10 years of expertise in the core concepts of data engineering and machine learning as outlined in the syllabus. Strong practical proficiency in Python and its data science ecosystem, specifically Scikit-learn, Pandas, NumPy, and visualization libraries (e.g., Matplotlib, Seaborn). Proven experience in teaching, preferably at the undergraduate level, with an ability to make complex topics accessible and engaging. Excellent communication and interpersonal skills. Preferred Qualifications A strong record of academic publications in reputable data mining, machine learning, or AI conferences/journals. Prior industry experience as a Data Scientist, Big Data Engineer, Machine Learning Engineer, or in a similar role. Experience with big data technologies (e.g., Spark, Hadoop) and/or deep learning frameworks (e.g., TensorFlow, PyTorch). Experience in mentoring student teams for data science competitions or hackathons. Perks & Benefits Competitive salary packages aligned with industry standards. Access to state-of-the-art labs and classroom facilities. To know more about us, feel free to explore our website: Newton School of Technology. We look forward to the possibility of having you join our academic team and help shape the future of tech education!

Posted 4 days ago

Apply