2.0 - 8.0 years
19 - 23 Lacs
Hyderabad
Work from Office
In this role you will be joining the Enterprise Data Solutions team within the Digital & Information Technology organization. Driven by a passion for operational excellence and innovation, we partner with all functional groups to provide the expertise and technologies that will enable the company to digitalize, simplify, and scale for the future. We are seeking an experienced Sr. Data Engineer to join our Enterprise Data Solutions team. The ideal candidate will have a strong background in data engineering, data analysis, business intelligence, and data management. This role will be responsible for the ingestion, processing, and storage of data in our Azure Databricks data lake and SQL Server data warehouses.
OVERVIEW: The Enterprise Data Solutions team provides Skillsoft with the data backbone needed to seamlessly connect systems and enable data-driven business insights through democratized, analytics-ready data sets. Our mission is to:
Deliver analytics-ready data sets that enhance business insights, drive decision making, and foster a culture of data-driven innovation.
Set a gold standard for process, collaboration, and communication.
OPPORTUNITY HIGHLIGHTS:
Lead the identification of business data requirements, create data models, design processes that align with the business logic, and regularly communicate with business stakeholders to ensure delivery meets business needs.
Design ETL processes, develop source-to-target mappings and integration workflows, and manage load processes to support regular and ad hoc activities, considering the needs of downstream systems, functions, and visualizations (a minimal PySpark sketch of this kind of pipeline follows this posting).
Work with the latest open-source tools, libraries, platforms, and languages to build data products that enable other analysts to explore and interact with large and complex data sets.
Build robust systems and reusable code modules to solve problems across the team and organization, with an eye on the long-term maintenance and support of the application.
Perform routine testing of your own and others' work to guarantee accurate, complete processes that support business needs.
Maintain awareness of, and comply with, all organizational development standards, industry best practices, and business, security, privacy, and retention requirements.
Routinely monitor performance, and diagnose and implement tuning/optimization strategies to guarantee a highly efficient data structure.
Collaborate with other engineers through active participation in code reviews and challenge the team to deliver with precision, consistency, and speed.
Document data flows and technical designs to ensure compliance with organizational, business, and security best practices.
Regularly monitor timelines and workload; ensure delivery promises are met or exceeded.
Support the BI mission by learning new technologies and supporting other projects as needed.
Provide code reviews and technical guidance to the team.
Collaborate closely with the SA and TPO to gather requirements and develop enterprise solutions.
SKILLS & QUALIFICATIONS:
Bachelor's degree in a quantitative field such as engineering, finance, data science, statistics, or economics.
5+ years of experience in the data engineering/data management space, working with enterprise-level production data warehouses.
5+ years of experience working with Azure Databricks.
5+ years of experience in SQL and PySpark.
Ability to work in an Agile methodology environment.
Experience and interest in cloud migration / the journey to the cloud for data platforms and landscapes.
Strong business acumen, analytical skills, and technical abilities.
Practical problem-solving skills and the ability to move complex projects forward.
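For illustration of the kind of ingestion work this posting describes, here is a minimal PySpark sketch of an ETL step that lands raw files in a Delta table on Databricks. The paths, table name, and column names are hypothetical assumptions, not details from the posting.

```python
# Illustrative only: a minimal PySpark ETL sketch.
# Paths, table names, and columns are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("learning-activity-ingest").getOrCreate()

# Ingest raw source files from a hypothetical landing area
raw = (
    spark.read
    .option("header", True)
    .csv("/mnt/landing/learning_activity/*.csv")
)

# Basic cleansing and conforming to the target model
clean = (
    raw.dropDuplicates(["activity_id"])
       .withColumn("activity_date", F.to_date("activity_date"))
       .filter(F.col("user_id").isNotNull())
)

# Load into a Delta table that downstream reporting can query
(
    clean.write
    .format("delta")
    .mode("append")
    .saveAsTable("analytics.learning_activity")
)
```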
Posted 3 days ago
12.0 years
0 Lacs
India
On-site
Job Title: Spark Scala Architect / Big Data
Experience: 12 to 16 Years
Location: Bangalore, Pune, Hyderabad, Mumbai
Notice Period: Immediate to 15 days maximum
Employment Type: Full-Time
Interview Mode: Virtual
Job Description
We are looking for an experienced Databricks + PySpark Architect to lead the design and implementation of advanced data processing solutions on the cloud. The ideal candidate will have a strong background in big data architecture, Databricks, and PySpark, with a solid understanding of AWS services.
Core Roles & Responsibilities:
Architect and implement scalable data pipelines using Databricks and PySpark (a small tuning-oriented sketch follows this posting).
Lead end-to-end architecture and solution design for large-scale data platforms.
Collaborate with stakeholders to understand business requirements and translate them into technical solutions.
Optimize the performance and scalability of data engineering workflows.
Integrate and deploy solutions on the AWS cloud using services such as S3, Glue, EMR, and Lambda.
Ensure best practices for data security, governance, and compliance.
Guide and mentor development teams in big data technologies and architecture.
Primary Skills: Expertise in Databricks and PySpark; strong hands-on experience with data engineering on cloud platforms.
Secondary Skills: Proficiency with AWS services for data processing and storage; familiarity with DevOps practices and CI/CD pipelines on the cloud.
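As a hedged illustration of the pipeline-tuning decisions this role mentions (Databricks/PySpark on AWS), the sketch below broadcasts a small dimension table to avoid a shuffle join and partitions the output so downstream queries can prune files. The bucket paths, table layout, and column names are assumptions.

```python
# Hypothetical sketch of pipeline tuning choices; paths and columns are illustrative.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.functions import broadcast

spark = SparkSession.builder.appName("orders-enrichment").getOrCreate()

orders = spark.read.parquet("s3://example-data-lake/raw/orders/")        # large fact data
customers = spark.read.parquet("s3://example-data-lake/raw/customers/")  # small dimension

# Broadcast the small dimension to avoid an expensive shuffle join
enriched = orders.join(broadcast(customers), on="customer_id", how="left")

# Partition the output by date so downstream queries prune files efficiently
(
    enriched
    .withColumn("order_date", F.to_date("order_ts"))
    .write
    .mode("overwrite")
    .partitionBy("order_date")
    .parquet("s3://example-data-lake/curated/orders_enriched/")
)
```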
Posted 3 days ago
1.0 - 2.0 years
3 - 4 Lacs
Madurai
Work from Office
We are looking for a highly motivated and experienced Branch Receivable Officer to join our team at Equitas Small Finance Bank. The ideal candidate will have 1-2 years of experience in the BFSI industry, preferably with knowledge of Assets, Inclusive Banking, SBL, Mortgages, and Receivables.
Roles and Responsibilities
Manage and oversee branch receivables operations for timely and accurate payments.
Develop and implement strategies to improve receivables management and reduce delinquencies.
Collaborate with cross-functional teams to resolve customer complaints and issues.
Analyze and report on receivables performance metrics to senior management.
Ensure compliance with regulatory requirements and internal policies.
Maintain accurate records and reports of receivables transactions.
Job Requirements
Strong understanding of financial concepts, including accounting and auditing principles.
Excellent communication and interpersonal skills for effective customer interaction.
Ability to work in a fast-paced environment and meet deadlines.
Proficiency in MS Office and other relevant software applications.
Strong analytical and problem-solving skills to identify areas for improvement.
Experience working with diverse stakeholders, including customers, colleagues, and management.
Posted 3 days ago
3.0 - 8.0 years
4 - 8 Lacs
Mumbai
Work from Office
Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: Data Engineering, Databricks Unified Data Analytics Platform
Minimum 3 year(s) of experience is required.
Educational Qualification: 15 years of full-time education
Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to effectively migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and contribute to the overall data strategy of the organization, ensuring that data solutions are efficient, scalable, and aligned with business objectives. You will also monitor and optimize existing data processes to enhance performance and reliability, while staying updated with the latest industry trends and technologies to continuously improve data management practices.
Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Collaborate with stakeholders to gather and analyze data requirements.
- Design and implement data models that support business needs.
Professional & Technical Skills:
- Must-have skills: Proficiency in Snowflake Data Warehouse.
- Good-to-have skills: Experience with Data Engineering, Databricks Unified Data Analytics Platform.
- Strong understanding of ETL processes and data integration techniques.
- Experience with data quality assurance and data governance practices.
- Familiarity with cloud-based data solutions and architecture.
Additional Information:
- The candidate should have a minimum of 3 years of experience in Snowflake Data Warehouse.
- This position is based at our Mumbai office.
- 15 years of full-time education is required.
Posted 3 days ago
3.0 - 8.0 years
10 - 14 Lacs
Bengaluru
Work from Office
Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must-have skills: PySpark
Good-to-have skills: NA
Minimum 3 year(s) of experience is required.
Educational Qualification: 15 years of full-time education
Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various stakeholders to gather requirements, overseeing the development process, and ensuring that the applications meet the specified needs. You will also be responsible for troubleshooting issues and providing guidance to team members, fostering a collaborative environment that encourages innovation and efficiency in application development.
Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Mentor junior team members to support their professional growth.
Professional & Technical Skills:
- Good-to-have skills: AWS S3, Delta Lake, Airflow (a minimal Airflow orchestration sketch follows this posting).
- 4+ years of experience in Python.
- Candidate must be a strong, hands-on senior developer.
- Candidate must possess good technical and non-technical communication skills to highlight areas of concern and risk.
- Should have good troubleshooting skills to perform RCA of production support issues.
Additional Information:
- The candidate should have a minimum of 3 years of experience in PySpark.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
- Candidate must be willing to work in Shift B, i.e. from 11 AM IST to 9 PM IST, and to provide weekend support as per a pre-agreed rota. A compensatory holiday may be provided for the weekend shift.
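Since this posting pairs PySpark with Airflow, here is a minimal Airflow DAG sketch showing how two Spark jobs might be sequenced. The DAG id, schedule, and script paths are assumptions, and a recent Airflow 2.x API is assumed.

```python
# A minimal Airflow DAG sketch, purely illustrative; ids, schedule, and paths are assumptions.
from datetime import datetime
from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_events_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="0 2 * * *",   # run daily at 02:00 (assumes Airflow 2.4+)
    catchup=False,
) as dag:
    extract = BashOperator(
        task_id="extract_raw_events",
        bash_command="spark-submit /opt/jobs/extract_raw_events.py",
    )
    transform = BashOperator(
        task_id="build_delta_tables",
        bash_command="spark-submit /opt/jobs/build_delta_tables.py",
    )
    extract >> transform  # run the transform only after extraction succeeds
```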
Posted 3 days ago
2.0 - 7.0 years
11 - 16 Lacs
Gurugram
Work from Office
Management Level: Ind & Func AI Decision Science Analyst
Job Location: Bangalore / Gurgaon
Must-have Skills: Life Sciences/Pharma/Healthcare projects and delivering successful outcomes; commercial and clinical analytics; statistical models/machine learning, including segmentation and predictive modeling, hypothesis testing, multivariate statistical analysis, time series techniques, and optimization.
Good-to-have Skills: Proficiency in programming languages such as R, Python, SQL, and Spark; AWS, Azure, or Google Cloud for deploying and scaling language models; data visualization tools like Tableau and Power BI.
Job Summary
We are seeking an experienced and visionary analyst to join the Accenture S&C Global Network - Data & AI practice, which helps our clients grow their business in entirely new ways. Analytics enables our clients to achieve high performance through insights from data - insights that inform better decisions and strengthen customer relationships. From strategy to execution, Accenture works with organizations to develop analytic capabilities - from accessing and reporting on data to predictive modelling - to outperform the competition.
Key Responsibilities
Support delivery of small to medium-sized teams to deliver consulting projects for global clients.
An opportunity to work on high-visibility projects with top Pharma clients around the globe.
Personalized training modules to develop your strategy and consulting acumen and to grow your skills, industry knowledge, and capabilities.
Responsibilities may include strategy, implementation, process design, and change management for specific modules.
Work with the team or as an individual contributor on the assigned project, drawing on a variety of skills from data engineering to data science.
Develop assets and methodologies, points of view, research, or white papers for use by the team and the larger community.
Work on a variety of projects in data modeling, data engineering, data visualization, data science, etc.
Acquire new skills that have utility across industry groups.
Additional Information
Ability to solve complex business problems and deliver client delight.
Strong writing skills to build points of view on current industry trends.
Good communication, interpersonal, and presentation skills.
About Our Company | Accenture
Qualification
Experience: Proven experience (2+ years) working on Life Sciences/Pharma/Healthcare projects and delivering successful outcomes.
Educational Qualification: Bachelor's or Master's degree in Statistics, Data Science, Applied Mathematics, Business Analytics, Computer Science, Information Systems, or another quantitative field.
Posted 3 days ago
2.0 - 3.0 years
5 - 9 Lacs
Kochi
Work from Office
Job Title: Data Engineer Sr. Analyst (ACS Song)
Management Level: Level 10 - Sr. Analyst
Location: Kochi, Coimbatore, Trivandrum
Must-have skills: Python/Scala, PySpark/PyTorch
Good-to-have skills: Redshift
Job Summary
You'll capture user requirements and translate them into business and digitally enabled solutions across a range of industries.
Roles and Responsibilities
Designing, developing, optimizing, and maintaining data pipelines that adhere to ETL principles and business goals (a small pandas data-preparation sketch follows this posting).
Solving complex data problems to deliver insights that help our business achieve its goals.
Sourcing data (structured and unstructured) from various touchpoints, and formatting and organizing it into an analyzable format.
Creating data products for analytics team members to improve productivity.
Calling AI services such as vision and translation to generate outcomes that can be used in further steps along the pipeline.
Fostering a culture of sharing, re-use, design, and operational efficiency of data and analytical solutions.
Preparing data to create a unified database and building tracking solutions that ensure data quality.
Creating production-grade analytical assets deployed using the guiding principles of CI/CD.
Professional and Technical Skills
Expert in at least two of Python, Scala, PySpark, PyTorch, and JavaScript.
Extensive experience in data analysis in big data (Apache Spark) environments, data libraries (e.g. Pandas, SciPy, TensorFlow, Keras), and SQL, with 2-3 years of hands-on experience working on these technologies.
Experience with one of the many BI tools such as Tableau, Power BI, or Looker.
Good working knowledge of key concepts in data analytics, such as dimensional modeling, ETL, reporting/dashboarding, data governance, dealing with structured and unstructured data, and corresponding infrastructure needs.
Worked extensively with Microsoft Azure (ADF, Function Apps, ADLS, Azure SQL), AWS (Lambda, Glue, S3), Databricks analytical platforms/tools, and the Snowflake Cloud Data Warehouse.
Additional Information
Experience working with cloud data warehouses such as Redshift or Synapse.
Certification in any one of the following or equivalent: AWS Certified Data Analytics - Specialty; Microsoft Certified Azure Data Scientist Associate; Snowflake SnowPro Core; Databricks Data Engineering.
About Our Company | Accenture
Qualification
Experience: 3.5-5 years of experience is required.
Educational Qualification: Graduation.
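As an illustration of the "format and organize data into an analyzable format" responsibility above, a small pandas sketch follows. The file names, columns, and aggregation are hypothetical assumptions.

```python
# Illustrative pandas sketch of turning raw exports into an analyzable data set.
# File names and columns are hypothetical.
import pandas as pd

# Combine structured exports from two hypothetical touchpoints
web = pd.read_csv("web_sessions.csv", parse_dates=["session_start"])
crm = pd.read_json("crm_contacts.json")

sessions = (
    web.dropna(subset=["user_id"])
       .merge(crm[["user_id", "segment", "country"]], on="user_id", how="left")
)

# Derive a simple daily summary for the analytics team
daily = (
    sessions.assign(session_date=sessions["session_start"].dt.date)
            .groupby(["session_date", "segment"], dropna=False)
            .agg(sessions=("user_id", "count"), users=("user_id", "nunique"))
            .reset_index()
)

daily.to_parquet("daily_sessions_by_segment.parquet", index=False)
```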
Posted 3 days ago
3.0 - 8.0 years
10 - 14 Lacs
Bengaluru
Work from Office
Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must-have skills: Palantir Foundry
Good-to-have skills: NA
Minimum 3 year(s) of experience is required.
Educational Qualification: 15 years of full-time education
Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various stakeholders to gather requirements, overseeing the development process, and ensuring that the applications meet the specified needs. You will also engage in problem-solving discussions with your team, providing guidance and support to ensure successful project outcomes. Additionally, you will monitor project progress, address any challenges that arise, and facilitate communication among team members to foster a productive work environment.
Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Mentor junior professionals to support their growth and development.
Professional & Technical Skills:
- Must-have skills: Proficiency in Palantir Foundry.
- Strong understanding of application design and development principles.
- Experience with data integration and management within Palantir Foundry.
- Ability to troubleshoot and resolve application-related issues effectively.
- Familiarity with agile methodologies and project management practices.
Additional Information:
- The candidate should have a minimum of 3 years of experience in Palantir Foundry.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
Posted 3 days ago
4.0 years
0 Lacs
Pune, Maharashtra, India
On-site
The Data Engineer is accountable for developing high quality data products to support the Bank's regulatory requirements and data-driven decision making. A Data Engineer will serve as an example to other team members, work closely with customers, and remove or escalate roadblocks. By applying their knowledge of data architecture standards, data warehousing, data structures, and business intelligence, they will contribute to business outcomes on an agile team.
Responsibilities
Developing and supporting scalable, extensible, and highly available data solutions
Delivering on critical business priorities while ensuring alignment with the wider architectural vision
Identifying and helping address potential risks in the data supply chain
Following and contributing to technical standards
Designing and developing analytical data models
Required Qualifications & Work Experience
First Class Degree in Engineering/Technology (4-year graduate course)
7 to 12 years' experience implementing data-intensive solutions using agile methodologies
Experience of relational databases and using SQL for data querying, transformation and manipulation
Experience of modelling data for analytical consumers
Ability to automate and streamline the build, test and deployment of data pipelines
Experience in cloud-native technologies and patterns
A passion for learning new technologies, and a desire for personal growth, through self-study, formal classes, or on-the-job training
Excellent communication and problem-solving skills
An inclination to mentor; an ability to lead and deliver medium-sized components independently
Technical Skills (Must Have)
ETL: Design, develop, and maintain scalable and efficient data processing pipelines using Python and Spark.
Big Data: Experience of 'big data' platforms such as Hadoop and Hive for data storage and processing.
Data Warehousing & Database Management: Understanding of data warehousing concepts and relational databases (Oracle, MSSQL, MySQL), including SQL and performance tuning.
Data Modeling & Design: Good exposure to data modeling techniques; design, optimization and maintenance of data models and data structures.
Languages: Proficient in the Python programming language.
DevOps: Exposure to concepts and enablers - CI/CD platforms, version control, automated quality control management.
Technical Skills (Valuable/Good To Have)
Cloud: Experience with the AWS Cloud Platform.
DevOps: Experience with CI/CD tools such as LSE (Light Speed Enterprise), Jenkins, and GitHub. Certification on any of the above topics would be an advantage.
Data Quality & Controls: Exposure to data validation, cleansing, enrichment and data controls (a small data-quality sketch follows this posting).
Containerization: Fair understanding of containerization platforms such as Docker and Kubernetes.
File Formats: Exposure to working with event/file/table formats such as Avro, Parquet, Protobuf, Iceberg, and Delta.
Others: Experience of using a job scheduler, e.g. Autosys. Exposure to business intelligence tools, e.g. Tableau, Power BI.
------------------------------------------------------
Job Family Group: Technology
------------------------------------------------------
Job Family: Digital Software Engineering
------------------------------------------------------
Time Type: Full time
------------------------------------------------------
Most Relevant Skills
Please see the requirements listed above.
------------------------------------------------------
Other Relevant Skills
For complementary skills, please see above and/or contact the recruiter.
------------------------------------------------------
Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View Citi's EEO Policy Statement and the Know Your Rights poster.
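The Data Quality & Controls item above could look something like the following minimal PySpark sketch of validation checks; the table name and rules are assumptions for illustration, not the Bank's actual controls.

```python
# A hedged sketch of simple data-quality controls; table names and rules are assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq-checks").getOrCreate()
df = spark.table("curated.transactions")  # hypothetical curated table

# Count records that violate each (illustrative) rule
checks = {
    "null_transaction_id": df.filter(F.col("transaction_id").isNull()).count(),
    "negative_amount": df.filter(F.col("amount") < 0).count(),
    "duplicate_keys": df.count() - df.dropDuplicates(["transaction_id"]).count(),
}

failures = {name: n for name, n in checks.items() if n > 0}
if failures:
    # In a real pipeline this might alert, quarantine bad records, or fail the run
    raise ValueError(f"Data quality checks failed: {failures}")
```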
Posted 3 days ago
3.0 - 8.0 years
5 - 9 Lacs
Chennai
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Apache Spark
Good-to-have skills: MySQL, Python (Programming Language), Google BigQuery
Minimum 3 year(s) of experience is required.
Educational Qualification: 15 years of full-time education
Summary: As an Application Developer, you will engage in the design, construction, and configuration of applications tailored to fulfill specific business processes and application requirements. Your typical day will involve collaborating with team members to understand project needs, developing innovative solutions, and ensuring that applications are optimized for performance and usability. You will also participate in testing and debugging processes to ensure the applications function as intended, while continuously seeking ways to enhance application efficiency and user experience.
Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Assist in the documentation of application specifications and user guides.
- Engage in code reviews to ensure quality and adherence to best practices.
Professional & Technical Skills:
- Must-have skills: Proficiency in Apache Spark.
- Good-to-have skills: Experience with MySQL, Python (Programming Language), Google BigQuery.
- Strong understanding of data processing frameworks and distributed computing.
- Experience in developing and deploying applications in cloud environments.
- Familiarity with version control systems such as Git.
Additional Information:
- The candidate should have a minimum of 3 years of experience in Apache Spark.
- This position is based at our Chennai office.
- 15 years of full-time education is required.
Posted 3 days ago
7.0 - 12.0 years
9 - 13 Lacs
Chennai
Work from Office
Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: NA
Minimum 7.5 year(s) of experience is required.
Educational Qualification: Engineering graduate, preferably a Computer Science graduate; 15 years of full-time education
Summary: As a Data Platform Engineer, you will be responsible for assisting with the blueprint and design of the data platform components using the Databricks Unified Data Analytics Platform. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Roles & Responsibilities:
- Assist with the blueprint and design of the data platform components using the Databricks Unified Data Analytics Platform.
- Collaborate with Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
- Develop and maintain data pipelines using the Databricks Unified Data Analytics Platform.
- Design and implement data security and access controls using the Databricks Unified Data Analytics Platform.
- Troubleshoot and resolve issues related to data platform components using the Databricks Unified Data Analytics Platform.
Professional & Technical Skills:
- Must-have skills: Experience with the Databricks Unified Data Analytics Platform.
- Good-to-have skills: Experience with other big data technologies such as Hadoop, Spark, and Kafka.
- Strong understanding of data modeling and database design principles.
- Experience with data security and access controls.
- Experience with data pipeline development and maintenance.
- Experience with troubleshooting and resolving issues related to data platform components.
Additional Information:
- The candidate should have a minimum of 7.5 years of experience in the Databricks Unified Data Analytics Platform.
- The ideal candidate will possess a strong educational background in computer science or a related field, along with a proven track record of delivering impactful data-driven solutions.
- This position is based at our Bangalore, Hyderabad, Chennai and Pune offices.
- Mandatory office attendance (RTO) for 2-3 days, working in one of two shifts (Shift A: 10:00 AM to 8:00 PM IST; Shift B: 12:30 PM to 10:30 PM IST).
Posted 3 days ago
3.0 - 8.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Python (Programming Language)
Good-to-have skills: NA
Minimum 3 year(s) of experience is required.
Educational Qualification: 15 years of full-time education
Summary: As an Application Developer, you will engage in the design, construction, and configuration of applications tailored to fulfill specific business processes and application requirements. Your typical day will involve collaborating with team members to understand project needs, developing innovative solutions, and ensuring that applications are optimized for performance and usability. You will also participate in testing and debugging processes to guarantee the quality and functionality of the applications you create, while continuously seeking ways to enhance existing systems and processes.
Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Assist in the documentation of application specifications and user guides.
- Engage in code reviews to ensure adherence to best practices and standards.
Professional & Technical Skills:
- Good-to-have skills: PySpark, AWS, Airflow, Databricks, SQL, Scala.
- 4+ years of experience in Python.
- Candidate must be a strong, hands-on senior developer.
- Candidate must possess good technical and non-technical communication skills to highlight areas of concern and risk.
- Should have good troubleshooting skills to perform RCA of production support issues.
Additional Information:
- The candidate should have a minimum of 3 years of experience in Python (Programming Language).
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
- Candidate must be willing to work in Shift B, i.e. daily from 9 PM/10 PM IST.
Posted 3 days ago
15.0 - 20.0 years
10 - 14 Lacs
Hyderabad
Work from Office
Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must-have skills: Microsoft Fabric
Good-to-have skills: NA
Minimum 7.5 year(s) of experience is required.
Educational Qualification: 15 years of full-time education
Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure that project goals are met, facilitating discussions to address challenges, and guiding your team through the development process. You will also be responsible for maintaining communication with stakeholders to provide updates and gather feedback, ensuring that the applications align with business needs and technical requirements. Your role will require a balance of technical expertise and leadership skills to drive successful project outcomes.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate training and development opportunities for team members to enhance their skills.
- Monitor project progress and implement necessary adjustments to meet deadlines.
Professional & Technical Skills:
- Should have exposure to Azure data components: Azure Data Factory and Azure Data Lake.
- Building ETL processes to extract, transform, and load data into the data models (a generic PySpark sketch follows this posting).
- Developing and maintaining data pipelines and integration workflows.
- Troubleshooting and resolving issues related to data models, ETL processes, and reporting.
- Design and implement data pipelines for ingestion, transformation, and loading (ETL/ELT) using Fabric Data Factory pipelines and Dataflows Gen2.
- Develop scalable and reliable solutions for batch data integration across various structured and unstructured data sources.
- Oversee the development of data pipelines for smooth data flow into the Fabric Data Warehouse.
- Implement and maintain data solutions in the Fabric Lakehouse and Fabric Warehouse.
Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Microsoft Fabric.
- This position is based in Hyderabad.
- 15 years of full-time education is required.
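As a generic illustration of the ingest-transform-load pattern this role describes, here is a PySpark sketch of loading a conformed table into a lakehouse. It uses plain Spark and Delta APIs rather than any Fabric-specific interface, and the folder path, columns, and table name are assumptions.

```python
# Generic PySpark ELT sketch; not a Fabric-specific API. Paths and names are assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Read raw JSON from a hypothetical lakehouse landing folder
raw = spark.read.json("Files/raw/sales/")

# Conform to the target model: select, derive, deduplicate
conformed = (
    raw.select("order_id", "customer_id", "amount", "order_ts")
       .withColumn("order_date", F.to_date("order_ts"))
       .dropDuplicates(["order_id"])
)

# Load into a lakehouse table for the warehouse/reporting layer to consume
conformed.write.mode("append").format("delta").saveAsTable("sales_orders")
```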
Posted 3 days ago
3.0 - 8.0 years
5 - 9 Lacs
Chennai
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Apache Spark
Good-to-have skills: MySQL, Python (Programming Language), Google BigQuery
Minimum 3 year(s) of experience is required.
Educational Qualification: 15 years of full-time education
Summary: As an Application Developer, you will engage in the design, construction, and configuration of applications tailored to fulfill specific business processes and application requirements. Your typical day will involve collaborating with team members to understand project needs, developing innovative solutions, and ensuring that applications are optimized for performance and usability. You will also participate in testing and debugging processes to guarantee the quality of the applications you create, while continuously seeking ways to enhance functionality and user experience.
Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Collaborate with cross-functional teams to gather and analyze requirements for application development.
- Participate in code reviews to ensure adherence to best practices and coding standards.
Professional & Technical Skills:
- Must-have skills: Proficiency in Apache Spark.
- Good-to-have skills: Experience with MySQL, Python (Programming Language), Google BigQuery.
- Strong understanding of data processing and transformation techniques.
- Experience with application development frameworks and methodologies.
- Familiarity with cloud computing platforms and services.
Additional Information:
- The candidate should have a minimum of 3 years of experience in Apache Spark.
- This position is based at our Chennai office.
- 15 years of full-time education is required.
Posted 3 days ago
15.0 - 20.0 years
5 - 9 Lacs
Chennai
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Spring Boot
Good-to-have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years of full-time education
Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with team members to understand project needs, developing application features, and ensuring that the applications are aligned with business objectives. You will also engage in problem-solving discussions and contribute to the overall success of the projects by implementing effective solutions.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure timely delivery of application features.
Professional & Technical Skills:
- Data structures & algorithms, Java 17/Java EE, Spring Boot, CI/CD.
- Web services using RESTful APIs, the Spring framework, caching techniques, PostgreSQL, JUnit for testing, and containerization with Kubernetes/Docker; Airflow, GCP, Spark, Kafka.
- Hands-on experience building alerting, monitoring, and logging for microservices using frameworks such as OpenObserve/Splunk, Grafana, and Prometheus.
Additional Information:
- The candidate should have a minimum of 5 years of experience in Spring Boot.
- This position is based in Chennai.
- 15 years of full-time education is required.
Posted 3 days ago
15.0 - 20.0 years
10 - 14 Lacs
Navi Mumbai
Work from Office
Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must-have skills: Python (Programming Language), AWS Architecture, Apache Spark, PySpark
Good-to-have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years of full-time education
Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure project milestones are met, facilitating discussions to address challenges, and guiding your team in implementing effective solutions. You will also engage in strategic planning sessions to align project goals with organizational objectives, ensuring that all stakeholders are informed and involved in the development process. Your role will be pivotal in driving innovation and efficiency within the application development lifecycle, fostering a collaborative environment that encourages team growth and success.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and implement necessary adjustments to meet deadlines.
Professional & Technical Skills:
- Must-have skills: Proficiency in Python (Programming Language), Apache Spark, AWS Architecture, PySpark.
- Strong understanding of software development methodologies.
- Experience with application design and architecture.
- Familiarity with cloud computing concepts and services.
- Ability to troubleshoot and optimize application performance.
Additional Information:
- The candidate should have a minimum of 5 years of experience in Python (Programming Language).
- This position is based in Mumbai.
- 15 years of full-time education is required.
Posted 3 days ago
1.0 years
0 Lacs
Gurugram, Haryana, India
On-site
🚀 Job Description: Sales Development Representative (SDR) | AI SaaS Startup 🚀 📍 Location: Gurgaon 📌 Type: Full-Time 🔥 About Darwix AI – The Fastest Growing AI-Powered Sales Tech Platform At Darwix AI , we are not just selling a product—we are transforming the way businesses sell. Our Gen-AI-powered sales enablement suite is redefining how enterprises engage customers, close deals, and scale revenue using AI-driven automation and insights. Backed by top VCs and AI pioneers, Darwix AI is growing at lightning speed , helping enterprise clients across the world unlock the true power of AI in sales. 💡 This is a once-in-a-lifetime opportunity for ambitious SDRs to work at the intersection of AI and SaaS, engage with top enterprise clients, and build a career in high-growth tech sales. This role is not for everyone. It’s for high-energy, high-performance individuals who thrive in fast-paced environments, love sales, and want to build relationships with the top leadership at Fortune 500 companies. 🚀 Role Overview – Own the Pipeline, Engage Enterprise Clients, Build the Future As a Sales Development Representative (SDR) at Darwix AI , you will be responsible for: ✅ Identifying and prospecting enterprise clients across key geographies (MENA, US, India). ✅ Engaging with CXOs, Heads of Sales, and Revenue Leaders to introduce AI-powered sales tech. ✅ Driving the top of the sales funnel, booking high-value meetings, and qualifying strategic deals. 💡 This role is the ultimate launchpad for a successful career in SaaS sales. If you want to: 🔥 Work at the hottest AI SaaS startup 🚀 Engage with Fortune 500 and high-growth enterprise clients 💰 Unlock massive earnings potential 💡 Learn from the best in enterprise sales Then this is your defining opportunity. 🔥 Key Responsibilities – Own Outbound Sales, Book Enterprise Meetings, Build Pipeline 1️⃣ Prospecting & Lead Generation Research, identify, and engage potential enterprise clients using LinkedIn Sales Navigator, Apollo, Crunchbase, and other prospecting tools. Cold-call, email, and connect with senior decision-makers (CXOs, VPs, Directors). Execute highly personalized outbound campaigns to grab attention and spark interest. 2️⃣ Sales Engagement & Lead Qualification Engage prospects in meaningful conversations to understand their sales challenges and how AI can help. Qualify leads based on enterprise buying criteria and hand over the most promising accounts to the Account Executive team. Strategically book meetings with high-value prospects to drive sales conversations. 3️⃣ Enterprise Relationship Building & Strategic Outreach Connect and engage with top enterprise decision-makers in India, MENA, and US. Become an expert in AI-driven sales enablement to pitch the value of Darwix AI effectively. Use advanced sales intelligence tools to track buyer intent and tailor outreach. 4️⃣ CRM & Pipeline Management Maintain accurate and up-to-date lead records in the CRM (HubSpot/Salesforce). Own and manage a high-volume sales pipeline to ensure consistent revenue growth. Follow up with warm leads and nurture relationships to convert prospects into sales opportunities. 5️⃣ Continuous Learning & Sales Excellence Master SaaS and AI sales strategies through hands-on training and coaching. Work closely with marketing and product teams to refine messaging and target high-intent prospects. Attend industry events, webinars, and networking opportunities to build credibility and expand outreach. ✅ Who Should Apply? 
(This Role is NOT for Everyone) 🔍 High-Performance Sales Professionals You love sales, relationship-building, and winning deals. You are competitive, proactive, and thrive on hitting targets. ⚙️ Enterprise SaaS Sales Hustlers You are comfortable reaching out to CXOs and decision-makers. You understand SaaS sales cycles and how to drive engagement. 🚀 Startup Hustlers & Execution-Obsessed Individuals You thrive in a fast-growing, high-intensity environment. You are ambitious and want to build a high-growth career in AI-driven sales. 📌 Qualifications – If You Don’t Meet These, Work Harder 1-3 years of experience in sales, lead generation, or business development. Experience in SaaS, AI, B2B tech, or enterprise sales is a strong plus. Track record of exceeding sales targets and booking high-quality meetings. Proficiency in CRM tools (HubSpot, Salesforce) and outbound prospecting tools. Strong communication and interpersonal skills, with the ability to engage CXOs and senior executives. Self-motivated, highly disciplined, and results-driven mindset. Ability to learn quickly, adapt, and excel in a high-growth SaaS environment. 💰 What You’ll Get – If You Can Handle It 🔥 Work at one of the fastest-scaling AI SaaS startups. 🚀 Engage directly with enterprise clients and top leadership. 📈 Uncapped earning potential – the more meetings you book, the more you earn. 💡 Accelerated career growth into Account Executive and beyond. 💰 Competitive base salary + performance-based incentives + high commissions. ⚠️ Final Warning – This Role is NOT for Everyone 🚫 If you are not comfortable reaching out to high-level executives, this is NOT for you. 🚫 If you need micromanagement to get things done, this is NOT for you. 🚫 If you don’t thrive in a fast-paced, competitive sales environment, this is NOT for you. 💡 But if you are a driven sales professional looking to make a mark in AI and SaaS, this is YOUR defining opportunity. 👉 Apply Now & Be Part of the AI Sales Revolution at Darwix AI! 🚀
Posted 3 days ago
3.0 - 8.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Spring Boot
Good-to-have skills: NA
Minimum 3 year(s) of experience is required.
Educational Qualification: 15 years of full-time education
Summary: As an Application Developer, you will engage in the design, construction, and configuration of applications tailored to fulfill specific business processes and application requirements. Your typical day will involve collaborating with team members to understand project needs, developing innovative solutions, and ensuring that applications are built to the highest standards of quality and performance. You will also participate in discussions to refine project goals and contribute to the overall success of the team.
Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Assist in the documentation of application specifications and design.
- Engage in code reviews to ensure adherence to best practices and standards.
Professional & Technical Skills:
- Data structures & algorithms, Java 17/Java EE, Spring Boot, CI/CD.
- Web services using RESTful APIs, the Spring framework, caching techniques, PostgreSQL, JUnit for testing, and containerization with Kubernetes/Docker; Airflow, GCP, Spark, Kafka.
- Hands-on experience building alerting, monitoring, and logging for microservices using frameworks such as OpenObserve/Splunk, Grafana, and Prometheus.
Additional Information:
- The candidate should have a minimum of 3 years of experience in Spring Boot.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
Posted 3 days ago
3.0 - 8.0 years
2 - 5 Lacs
Bengaluru
Work from Office
Project Role: Quality Engineer (Tester)
Project Role Description: Enables full stack solutions through multi-disciplinary team planning and ecosystem integration to accelerate delivery and drive quality across the application lifecycle. Performs continuous testing for security, API, and regression suites. Creates the automation strategy and automated scripts, and supports data and environment configuration. Participates in code reviews, monitors, and reports defects to support continuous improvement activities for the end-to-end testing process.
Must-have skills: Selenium
Good-to-have skills: Payments Fundamentals, Java Standard Edition, Microsoft SQL Server
Minimum 3 year(s) of experience is required.
Educational Qualification: 15 years of full-time education
Summary: As a Quality Engineer (Tester), you will enable full stack solutions through multi-disciplinary team planning and ecosystem integration to accelerate delivery and drive quality across the application lifecycle. You will perform continuous testing for security, API, and regression suites; create the automation strategy and automated scripts; and support data and environment configuration. You will participate in code reviews, and monitor and report defects to support continuous improvement activities for the end-to-end testing process.
Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Collaborate with cross-functional teams to ensure quality throughout the software development lifecycle.
- Develop and execute automated test scripts for regression testing.
- Analyze test results and provide feedback to the development team.
- Identify and report software defects to assist in maintaining product quality.
- Implement and maintain test automation frameworks for efficient testing processes.
Professional & Technical Skills:
- Must-have skills: Proficiency in Selenium.
- Good-to-have skills: Experience with Java Standard Edition, Microsoft SQL Server, Payments Fundamentals.
- Strong understanding of test automation principles and best practices.
- Experience in creating and maintaining automated test scripts.
- Knowledge of software testing methodologies and tools.
- Ability to troubleshoot and debug issues in test automation scripts.
- Data testing (primary skill, 80-85%) and UI automation testing (secondary skill, 15-20%); the candidate should have experience in both.
- Must-have: big data testing with Python and Spark; hands-on cloud testing (AWS preferred); experience in Spark SQL (not RDBMS SQL); Python scripting experience; UI automation testing experience (a small PySpark test sketch follows this posting).
- Cypress, JavaScript, Payments Fundamentals.
- Java, SQL, API integration, Selenium and the Cucumber framework; exposure to working with TeamCity, JIRA, and Git.
- Expertise in Java, Spring, PL/SQL, SQL, multithreading, and unit testing; exposure to working with TeamCity, JIRA, and Git; should have a basic understanding of equities derivatives.
Additional Information:
- The candidate should have a minimum of 3 years of experience in Selenium.
- This position is based at our Pune office.
- 15 years of full-time education is required.
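For the "big data testing with Python and Spark" requirement above, a small pytest-style sketch of reconciliation checks follows; the table names, paths, and rules are illustrative assumptions.

```python
# A pytest-style sketch of data reconciliation tests with PySpark.
# Table names, paths, and rules are illustrative assumptions.
import pytest
from pyspark.sql import SparkSession


@pytest.fixture(scope="session")
def spark():
    # Small local session for test runs
    return SparkSession.builder.master("local[2]").appName("dq-tests").getOrCreate()


def test_row_counts_match(spark):
    # Source extract and target table should agree on record counts
    source = spark.read.parquet("/data/source/payments/")
    target = spark.table("warehouse.payments")
    assert source.count() == target.count()


def test_no_unexpected_nulls(spark):
    # Key column in the target should never be null
    target = spark.table("warehouse.payments")
    assert target.filter(target["payment_id"].isNull()).count() == 0
```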
Posted 3 days ago
15.0 - 20.0 years
4 - 8 Lacs
Bengaluru
Work from Office
Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must-have skills: OneTrust Privacy Management
Good-to-have skills: NA
Minimum 7.5 year(s) of experience is required.
Educational Qualification: 15 years of full-time education
Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to effectively migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand their data needs and provide innovative solutions to enhance data accessibility and usability.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Develop and optimize data pipelines to ensure efficient data flow and processing.
- Monitor and troubleshoot data quality issues, implementing corrective actions as necessary.
Professional & Technical Skills:
- Must-have skills: Proficiency in OneTrust Privacy Management.
- Good-to-have skills: Experience with data governance.
- Strong understanding of data architecture and data modeling principles.
- Experience with ETL tools and data integration techniques.
- Familiarity with data quality frameworks and best practices.
Additional Information:
- The candidate should have a minimum of 7.5 years of experience in OneTrust Privacy Management.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
Posted 3 days ago
1.0 - 2.0 years
3 - 4 Lacs
Hanumangarh, Sujangarh, Jodhpur
Work from Office
We are looking for a highly skilled and experienced Relationship Manager to join our team at Equitas Small Finance Bank. The ideal candidate will have 1-2 years of experience in retail mortgages and relationship management, with a strong background in these areas.
Roles and Responsibilities
Manage and maintain relationships with existing clients to ensure customer satisfaction and retention.
Identify new business opportunities and develop strategies to acquire new customers.
Conduct market research and competitor analysis to stay informed about industry trends.
Collaborate with internal teams to provide comprehensive solutions to clients.
Develop and implement sales plans to meet or exceed monthly and quarterly targets.
Provide excellent customer service and support to resolve client queries and concerns.
Job Requirements
Strong knowledge of retail mortgages and relationship management principles.
Excellent communication and interpersonal skills.
Ability to work in a fast-paced environment and meet deadlines.
Strong analytical and problem-solving skills.
Experience working with diverse client groups and providing inclusive banking services.
Familiarity with SBL's retail mortgage products and services is an added advantage.
Posted 3 days ago
0 years
0 Lacs
Greater Kolkata Area
Remote
Chess Teacher Contributor (Freelance | Remote)
Location: Work From Home (WFH)
Type: Hourly-based | Freelance
Shift Timings: UK & Australia Shift (Evening/Night IST)

🎯 About the Role:
Connect2Learn is looking for a passionate and engaging Chess Teacher Contributor to join our growing global team. This is a freelance, remote opportunity ideal for individuals who love teaching chess and have prior experience handling international students, especially from the UK and Australia.

🧩 Key Responsibilities:
- Conduct live online chess sessions with international students (one-on-one or group)
- Create a fun, interactive, and concept-driven learning environment
- Track student progress and adapt lessons accordingly
- Communicate with parents (if required) and provide session feedback
- Maintain high-quality teaching standards aligned with our curriculum

✅ Requirements:
- Prior experience teaching chess to international students, especially in the UK & AUS regions
- Strong communication skills in English (spoken & written)
- Must be interactive, patient, and student-focused
- Ability to manage students across various age groups and skill levels
- Proficiency with online teaching platforms (Google Meet)
- Freelance mindset – responsible and self-driven
- Available for UK & AUS time zones

💰 Compensation:
- Hourly-based payout (discussed during the interview)
- Payment on a monthly/bi-weekly basis (based on class reports)

📩 How to Apply:
If you're a chess enthusiast with a teaching spark and international exposure, we'd love to hear from you!
Fill out this form: https://docs.google.com/forms/d/e/1FAIpQLSfb6zYkZ69OgvuKPA_ISqADOspCMEkawKAvBritUjW9UIo2BQ/viewform
Or connect with HR Rinika Sikdar (+9153759348) / hr@connect2learn.in
Posted 3 days ago
3.0 - 8.0 years
9 - 13 Lacs
Hyderabad
Work from Office
Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must have skills: Databricks Unified Data Analytics Platform
Good to have skills: Microsoft Power Business Intelligence (BI), Microsoft Azure Databricks
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also engaging in discussions to refine and enhance the data architecture. You will analyze data requirements and translate them into effective solutions that align with the organization's overall data strategy. Your role will require you to stay updated with the latest trends in data engineering and contribute to the continuous improvement of data processes and systems.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Engage in the design and implementation of data pipelines to support data integration and analytics.
- Collaborate with cross-functional teams to gather requirements and translate them into technical specifications.

Professional & Technical Skills:
- Must have skills: Proficiency in Databricks Unified Data Analytics Platform.
- Good to have skills: Experience with Microsoft Azure Databricks, Microsoft Power Business Intelligence (BI).
- Strong understanding of data modeling concepts and best practices.
- Experience with ETL processes and data warehousing solutions.
- Familiarity with cloud-based data solutions and architectures.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Hyderabad office.
- A 15 years full time education is required.

Qualification: 15 years full time education
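As a small illustration of the kind of Databricks-style pipeline step this listing describes, the sketch below joins a fact table to a dimension and writes an aggregate for downstream BI. The table names, columns, and output path are hypothetical; the Delta format assumes a Databricks runtime (or the delta-spark package) where Delta Lake is available.

```python
# Hypothetical sketch of a small Databricks-style pipeline step: join a fact table
# to a dimension and persist an aggregate. All names and paths are illustrative.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("platform-pipeline-sketch").getOrCreate()

# Illustrative in-memory inputs standing in for existing lake tables.
orders = spark.createDataFrame(
    [(1, "C1", 120.0), (2, "C2", 80.0), (3, "C1", 45.5)],
    ["order_id", "customer_id", "amount"],
)
customers = spark.createDataFrame(
    [("C1", "EMEA"), ("C2", "APAC")],
    ["customer_id", "region"],
)

# Model a simple aggregate: revenue per region.
revenue_by_region = (
    orders.join(customers, "customer_id", "left")
          .groupBy("region")
          .agg(F.sum("amount").alias("total_revenue"))
)

# Persist as a Delta table for downstream BI tools (e.g. Power BI).
# Assumes Delta Lake support is present, as it is on Databricks.
revenue_by_region.write.format("delta").mode("overwrite").save("/mnt/curated/revenue_by_region")

spark.stop()
```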
Posted 3 days ago
3.0 - 8.0 years
9 - 13 Lacs
Chennai
Work from Office
Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must have skills: Databricks Unified Data Analytics Platform
Good to have skills: Microsoft Power Business Intelligence (BI), Microsoft Azure Databricks
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also engaging in discussions to refine and enhance the data architecture. You will analyze data requirements and translate them into effective solutions, ensuring that the data platform meets the needs of various stakeholders. Additionally, you will participate in team meetings to share insights and contribute to the overall strategy of the data platform.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Engage in continuous learning to stay updated with the latest trends and technologies in data platforms.
- Assist in the documentation of data architecture and integration processes.

Professional & Technical Skills:
- Must have skills: Proficiency in Databricks Unified Data Analytics Platform.
- Good to have skills: Experience with Microsoft Power Business Intelligence (BI), Microsoft Azure Databricks.
- Strong understanding of data integration techniques and methodologies.
- Experience with data modeling and database design principles.
- Familiarity with cloud-based data solutions and architectures.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Chennai office.
- A 15 years full time education is required.

Qualification: 15 years full time education
Posted 3 days ago