4.0 years
0 Lacs
Mangaluru, Karnataka, India
On-site
Responsibilities Facilitate sprint planning, daily stand-up meetings, reviews/demos, and retrospectives for multiple teams in stable environments. Track and communicate team commitments, velocity, and sprint/quarterly progress. Collaborate across scrum teams in R&D to help remove team impediments. Escalate team impediments that the team is unable to resolve itself, or with your help. Relentlessly drive continuous improvement through: retrospectives, learning, giving feedback, challenging the team, and coaching others to do so as well. Continuous learning on agile techniques such as story mapping, CI/CD, BDD, TDD, Continuous Testing, Pairing, Automation. Highlight team success within and outside the team. Plan and facilitate quarterly planning events. Present and communicate to product development executives and leadership. Coach team members: To improve collaboration and self-organization On Agile practices and encourage inspection and adaptation To ensure the backlog is refined (properly defined User Stories) Qualifications Self-organizing, thorough, and efficient One or more of the following certifications: CSM, Advanced CSM, SSM, PSM I, PSM II, PSM III, SAFe Certification 4+ years of technical experience and 2+ years as a Scrum Master of combined QA/Dev scrum teams. Ability to establish a data-driven culture and a repeatable, structured, and disciplined approach to the agile process. Ability to manage the dependencies between team capacity, prioritization, software quality, and committed deadlines. Strong knowledge and certification in Agile methodologies and frameworks such as Lean, Kanban, XP, etc. Experience with JIRA/Confluence/SharePoint preferred and JIRA/Azure DevOps/SharePoint admin skills a bonus. Fluency with PowerPoint, Excel, and Word. Minimum of a bachelor's degree or equivalent.
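The velocity tracking this role describes boils down to a rolling average of completed story points. A minimal Python sketch, with hypothetical sprint names and point totals:

```python
# Illustrative sketch: sprint velocity as a rolling average.
# Sprint names and story-point figures below are hypothetical.
from statistics import mean

completed_points = {
    "Sprint 21": 34,
    "Sprint 22": 28,
    "Sprint 23": 31,
    "Sprint 24": 40,
}

def rolling_velocity(history: dict[str, int], window: int = 3) -> float:
    """Average story points completed over the last `window` sprints."""
    recent = list(history.values())[-window:]
    return mean(recent)

print(f"3-sprint velocity: {rolling_velocity(completed_points):.1f} points")
```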
Posted 1 day ago
2.0 - 4.0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Position Overview We are seeking a Business Analyst with a strong foundation in business acumen, finance, and data analytics to support our US operations. This role is ideal for someone who combines technical expertise in SQL, Power BI, and data tools with the ability to interpret business trends and deliver actionable insights. The successful candidate will possess excellent project management and communication skills to bridge the gap between analytics and business strategy. Key Responsibilities - Partner with US-based teams to analyze operational data, financial metrics, and performance KPIs. - Develop and maintain dashboards, scorecards, and automated reports using Power BI and other data visualization tools. - Provide strategic insights and recommendations to improve operational efficiency and business performance. - Lead and support data-driven projects from requirements gathering to final delivery. - Perform ad-hoc analyses to address operational and strategic business questions. - Ensure data integrity and accuracy while building and maintaining datasets and analytical models. - Collaborate with business and technology teams to define metrics and reporting standards. - Translate complex analytical findings into clear, actionable presentations for leadership teams. - Manage timelines and deliverables for multiple cross-functional projects, ensuring alignment with business objectives. Qualifications & Skills - Graduate degree in business or finance (MBA or Master's in Finance preferred). - 2-4 years of experience in business analysis, data analytics, or operations analytics. - Strong SQL skills for querying and data manipulation. - Advanced proficiency in Power BI and other data visualization/reporting tools (e.g., Tableau, Excel). - Strong understanding of business strategy and financial principles. - Experience with project management methodologies and tools. - Excellent problem-solving, analytical thinking, and critical reasoning abilities. - Exceptional written and verbal communication skills, with the ability to present to leadership. - Familiarity with US operational processes or exposure to global teams is a plus. Preferred Attributes - Demonstrated ability to connect data insights to business strategy and ROI. - Knowledge of cloud-based data platforms (e.g., Azure, Snowflake) is an advantage. - A proactive mindset with a "consultative" approach to solving business challenges. - Strong time management and ability to work across multiple time zones.
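As a flavor of the ad-hoc KPI analysis this role calls for, here is a minimal pandas sketch; the column names and figures are made up, not from any real dataset:

```python
# Minimal sketch of an ad-hoc KPI analysis with pandas; regions and
# amounts are hypothetical illustration data.
import pandas as pd

orders = pd.DataFrame({
    "region":  ["US-East", "US-East", "US-West", "US-West"],
    "revenue": [120_000, 95_000, 80_000, 110_000],
    "cost":    [ 70_000, 60_000, 55_000,  65_000],
})

kpis = (
    orders.groupby("region")
          .agg(total_revenue=("revenue", "sum"), total_cost=("cost", "sum"))
)
# Margin percentage, a typical operational KPI surfaced in a dashboard.
kpis["margin_pct"] = (kpis["total_revenue"] - kpis["total_cost"]) / kpis["total_revenue"] * 100
print(kpis.round(1))
```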
Posted 1 day ago
2.0 years
0 Lacs
Bengaluru, Karnataka, India
Remote
Sapiens is on the lookout for an Analyst (Service Desk) to become a key player in our Bangalore team. If you're a Service Desk pro and ready to take your career to new heights with an established, globally successful company, this role could be the perfect fit. Location: Bangalore, India Working Model: Our flexible work arrangement combines both remote and in-office work, optimizing flexibility and productivity. What You’ll Do Act as the first point of contact for customers and internal organisation users reporting issues and requests via calls, emails, monitoring tools, and self-service channels Collect complete details of the issue being reported and log a ticket in the ticketing tool. Carry out initial triage to ensure the ticket has complete information, appropriate priority and impact details, and the appropriate support team for assignment. Ensure associated internal tickets are created with complete information and appropriate support team assignment. Follow up regularly on tickets to ensure progress is being made by support teams, communicate up-to-date progress to customers, and use ticket states appropriately based on progress. Manage the ticket lifecycle end to end, adhering to the process in place. Proactively monitor the ServiceDesk ticket queue and take appropriate, timely actions. Coordinate end to end with stakeholders including customers, project teams, DEV teams, and Infrastructure teams wherever necessary to make progress on tickets and on ongoing issues and requests over email. Follow the on-call process, including multiple teams, without fail. Participate in Major Incident Management bridge calls along with the Incident Manager to drive towards resolution with the help of support teams. Coordinate RCA document preparation and submission within set timelines. Handle end-to-end Service Request Management. What To Have For This Position. Must-have Skills. Total experience of at least 2 years in a 24/7 ServiceDesk environment with application support experience Good hands-on knowledge of Incident and Major Incident Management and Service Request Management processes Good knowledge of the Problem Management process Good hands-on experience working in a dynamic, high-pressure environment dealing with multiple aspects at the same time Good hands-on experience with ticketing tools like ServiceNow. Good analytical and problem-solving skills with strong interpersonal and facilitation skills, along with effective communication (both written and verbal) skills Ready to work in rotational shifts and offs without any issues Team player who believes in collaborating with team members and lending extra support wherever needed Good To Have Skills. ITIL V4 foundation knowledge, preferably certified. ServiceDesk background and experience Basic knowledge of network monitoring – Good to have Basic knowledge of Windows administration – Good to have Basic understanding of technologies like Cloud (Azure/AWS), Database, SQL, scripting – Good to have Required Skills. Good communication skills, both verbal and written 24/7 ServiceDesk background ITIL knowledge and background Ready to work in a 24/7 team with rotational shifts About Sapiens Sapiens is a global leader in the insurance industry, delivering its award-winning, cloud-based SaaS insurance platform to over 600 customers in more than 30 countries. Sapiens’ platform offers pre-integrated, low-code capabilities to accelerate customers’ digital transformation. With more than 40 years of industry expertise, Sapiens has a highly professional team of over 5,000 employees globally.
For more information, visit us at www.sapiens.com. Sapiens is an equal opportunity employer. We value diversity and strive to create an inclusive work environment that embraces individuals from diverse backgrounds. Disclaimer: Sapiens India does not authorise any third parties to release employment offers or conduct recruitment drives on its behalf. Hence, beware of inauthentic and fraudulent job offers or recruitment drives from any individuals or websites purporting to represent Sapiens. Further, Sapiens does not charge any fee or other emoluments for any reason (including, without limitation, visa fees) or seek compensation from educational institutions to participate in recruitment events. Accordingly, please check the authenticity of any such offers before acting on them; where acted upon, you do so at your own risk. Sapiens shall neither be responsible for honouring or making good the promises made by fraudulent third parties, nor for any monetary or other loss incurred by the aggrieved individual or educational institution. In the event that you come across any fraudulent activities in the name of Sapiens, please feel free to report the incident to sharedservices@sapiens.com.
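The triage step described above typically maps impact and urgency onto a ticket priority. A hedged sketch of that ITIL-style matrix in Python; the matrix values here are illustrative, not Sapiens' actual priority scheme:

```python
# Illustrative ITIL-style impact/urgency triage for ticket logging.
PRIORITY_MATRIX = {
    ("high", "high"):     "P1 - Critical",
    ("high", "medium"):   "P2 - High",
    ("medium", "high"):   "P2 - High",
    ("medium", "medium"): "P3 - Moderate",
    ("low", "low"):       "P4 - Low",
}

def triage(impact: str, urgency: str) -> str:
    """Map impact and urgency onto a priority, defaulting to P3."""
    return PRIORITY_MATRIX.get((impact, urgency), "P3 - Moderate")

print(triage("high", "medium"))  # -> P2 - High
```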
Posted 1 day ago
2.0 years
4 - 5 Lacs
Saki, Mumbai, Maharashtra
On-site
Role - IT Project Coordinator Experience - 2+ years * Assist in project planning, scheduling, and tracking to ensure timely execution. * Coordinate with cross-functional teams (developers, testers, designers, and clients) to align project goals. * Monitor project progress, risks, and bottlenecks, providing regular updates to stakeholders. * Maintain project documentation (timelines, reports, meeting notes, etc.). * Handle resource allocation. * Facilitate team communication via stand-ups, meetings, and reporting tools (JIRA, Azure, Asana, etc.). * Ensure project deliverables comply with quality standards and client expectations. * Identify and mitigate risks to avoid delays or project failures. Job Type: Full-time Pay: ₹35,000.00 - ₹45,000.00 per month Benefits: Provident Fund Experience: IT project management: 2 years (Preferred) Jira: 2 years (Preferred) Team management: 1 year (Preferred) Work Location: In person Speak with the employer +91 8826276401
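The Jira tracking this role leans on can be automated. A hedged sketch using the third-party `jira` package (`pip install jira`); the server URL, credentials, and project key are placeholders:

```python
# Sketch: summarize open-sprint progress from Jira by status.
# Server, credentials, and JQL below are placeholders, not real values.
from jira import JIRA

jira = JIRA(server="https://example.atlassian.net",
            basic_auth=("user@example.com", "api-token"))

issues = jira.search_issues(
    "project = PROJ AND sprint in openSprints()", maxResults=100
)

by_status: dict[str, int] = {}
for issue in issues:
    status = issue.fields.status.name
    by_status[status] = by_status.get(status, 0) + 1

for status, count in sorted(by_status.items()):
    print(f"{status}: {count}")
```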
Posted 1 day ago
3.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Introduction IBM Infrastructure is a catalyst that makes the world work better because our clients demand it. Heterogeneous environments, the explosion of data, digital automation, and cybersecurity threats require hybrid cloud infrastructure that only IBM can provide. Your ability to be creative, think forward, and focus on innovation that matters is all supported by our growth-minded culture as we continue to drive career development across our teams. Collaboration is key to IBM Infrastructure's success, as we bring together different business units and teams that balance their priorities in a way that best serves our clients' needs. IBM's product and technology landscape includes Research, Software, and Infrastructure. Entering this domain positions you at the heart of IBM, where growth and innovation thrive. IBM Cloud Core Platform Services is a growing, agile, dynamic organization building and operating leading-edge, highly available, and distributed cloud services in IBM Cloud. We're looking for experienced cloud software engineers to join us. This technical role is focused on designing, developing, and deploying cloud services, automating wide ranges of tasks, problem-solving, and interfacing with other teams to solve complex problems. You will be part of a strong, agile, and modern team culture driven to create world-class cloud services, delivering an industry-leading user experience for our customers. As an integral part of the development team, you will get an opportunity to contribute to the cloud services architecture and design while helping us mentor the next generation of cloud engineers. Your Role And Responsibilities Becoming an expert and major contributor for designs and implementation efforts of the IBM Cloud Platform Services ecosystem Developing highly available, distributed cloud services, with emphasis on security, scalability, and user experience, using technologies like Golang, Java, Node.js, Cloudant, Redis, Docker, Kubernetes, Istio, and more. Reading open specifications and RFC documents and converting them to design docs and implementation Identifying opportunities and acting on improving existing tools, frameworks, and workflows Documenting and sharing your experience with team members, mentoring others Preferred Education Bachelor's Degree Required Technical And Professional Expertise A minimum of a bachelor's degree in Computer Science, Software Engineering, or equivalent At least 3 years of hands-on development experience building applications with one or more of the following: Java, Node.js, Golang, NoSQL DB, Redis, distributed caches, containers, etc. At least 3 years of experience building and operating highly secured, distributed cloud services with one or more of the following: IBM Cloud, AWS, Azure, Docker, container orchestration, performance testing, DevOps, etc. At least 1 year of experience in web technologies: HTTP, REST, JSON, HTML, JavaScript, etc. Solid understanding of the micro-service architecture and modern cloud programming practices. Strong ability to design a clean, developer-friendly API. Passionate about constant, continuous learning and applying new technologies as well as mentoring others. Keen troubleshooting skills and strong verbal/written communication skills.
Preferred Technical And Professional Experience Bachelor's degree in Computer Science, Software Engineering, or equivalent Knowledge of the IBM Cloud platform or another as-a-service platform and its architecture Experience as a technical lead managing a team of engineers driving development of highly scalable distributed systems Proficient with one or more project management tools – Jira, Git, Aha, etc.
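The posting's "clean, developer-friendly API" requirement is the kind of thing a tiny REST endpoint illustrates. A minimal standard-library sketch (a real IBM Cloud service would sit behind Docker/Kubernetes, but the request/response shape is the same idea; the route and payload are assumptions):

```python
# Minimal REST health endpoint returning JSON, standard library only.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class HealthHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/v1/health":
            body = json.dumps({"status": "ok", "version": "1.0.0"}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404, "resource not found")

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), HealthHandler).serve_forever()
```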
Posted 1 day ago
0.0 - 4.0 years
0 - 1 Lacs
Delhi, Delhi
On-site
Java Developer Experience: 4 to 7 years Work Mode: Hybrid (Contractual) Locations: Bangalore / Noida / Gurgaon Key Responsibilities: * Design, develop, test, and maintain Java-based applications. * Implement scalable microservices using Spring Boot. * Write clean, efficient, and well-documented code. * Work with multithreaded systems to ensure optimal performance. * Collaborate with cross-functional teams to define, design, and deliver new features. * Participate in all phases of the Software Development Life Cycle (SDLC). * Optimize applications for performance and scalability. * Debug and resolve technical issues across environments. * Ensure best practices in coding, testing, and deployment. Required Skills: * Strong expertise in Core Java. * Hands-on experience with Multithreading and concurrent programming. * Solid understanding and experience in Spring / Spring Boot. * Good knowledge of Microservices architecture. * Strong coding and problem-solving skills. * Experience with MySQL or other relational databases. * Familiarity with the complete SDLC process. Preferred Qualifications: * Bachelor’s/Master’s degree in Computer Science, Engineering, or related field. * Exposure to cloud platforms like AWS/GCP/Azure is a plus. * Knowledge of RESTful APIs, Docker, or Kubernetes is an advantage. * Familiarity with CI/CD tools and DevOps practices. Job Type: Contractual / Temporary Contract length: 6 - 12 months Pay: ₹80,000.00 - ₹100,000.00 per month Experience: SDLC: 4 years (Required) Job Types: Full-time, Contractual / Temporary Contract length: 12 months Pay: ₹80,000.00 - ₹90,000.00 per month Experience: Java: 5 years (Required) Azure: 4 years (Required) Location: Delhi, Delhi (Preferred) Work Location: In person
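This role centers on Java multithreading; to keep a single language across the sketches on this page, here is the same thread-pool idea in Python. The `fetch_price` function and symbols are hypothetical stand-ins for an I/O-bound call:

```python
# Thread-pool concurrency sketch: fan out I/O-bound work, gather results.
from concurrent.futures import ThreadPoolExecutor, as_completed

def fetch_price(symbol: str) -> tuple[str, float]:
    # Placeholder for an I/O-bound call (DB query, HTTP request, ...).
    return symbol, 100.0 + len(symbol)

symbols = ["INFY", "TCS", "WIPRO", "HCLTECH"]
with ThreadPoolExecutor(max_workers=4) as pool:
    futures = [pool.submit(fetch_price, s) for s in symbols]
    for future in as_completed(futures):
        symbol, price = future.result()
        print(f"{symbol}: {price:.2f}")
```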
Posted 1 day ago
1.0 years
0 Lacs
Hyderabad, Telangana, India
Remote
Introduction Ready to grow your career in the Cloud? Want to have a feeling that you are making a difference? This is your chance to become an integral part of a dynamic team of talented professionals developing and deploying innovative, industry-leading, cloud-based platform services. IBM Cloud Platform Core Services is a growing, agile, dynamic organization building and operating leading-edge, highly-available, and distributed cloud services in IBM Cloud. We're looking for experienced cloud software engineers to join us. This technical role is focused on designing, developing, and deploying cloud services, automating wide ranges of tasks, problem-solving, and interfacing with other teams to solve complex problems. You will be part of a strong, agile, and modern team culture driven to create world-class cloud services, delivering an industry-leading user experience for our customers. As an integral part of the development team, you will get an opportunity to contribute to the cloud services architecture and design while helping us mentor the next generation of cloud engineers. Your Role And Responsibilities Becoming an expert and major contributor for designs and implementation efforts of the IBM Cloud Platform Services ecosystem Developing highly-available, distributed cloud services, with emphasis on security, scalability, and user experience, using technologies like Java, Node.js, Golang, Cloudant, Redis, Docker, Kubernetes, Istio, and more. Identifying opportunities and acting on improving existing tools, frameworks, and workflows Documenting and sharing your experience with team members, mentoring others Preferred Education Bachelor's Degree Required Technical And Professional Expertise A minimum of a bachelor's degree in Computer Science, Software Engineering, or equivalent 1+ year of hands-on development experience building applications with one or more of the following: Java, Node.js, Golang, NoSQL DB, Redis, distributed caches, containers, etc. 1+ years of experience in web technologies: HTTP, REST, JSON, HTML, JavaScript, etc. Solid understanding of the micro-service architecture and modern cloud programming practices. Strong ability to design a clean, developer-friendly API. Passionate about constant, continuous learning and applying new technologies as well as mentoring others. Keen troubleshooting skills and strong verbal/written communication skills. Preferred Technical And Professional Experience Bachelor's degree in Computer Science, Software Engineering, or equivalent Understanding of cybersecurity and cryptography principles, certifications, and compliance. Experience in remotely supporting customer engagements to help drive adoption Experience building and operating highly secured, distributed cloud services with one or more of the following: IBM Cloud, AWS, Azure, Docker, container orchestration, performance testing, DevOps, etc.
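The HTTP/REST/JSON basics this posting lists can be shown in a few lines with only the standard library; the endpoint URL below is just a public test service used as a placeholder:

```python
# Sketch: issue an HTTP GET and decode the JSON response body.
import json
import urllib.request

def get_json(url: str) -> dict:
    """GET a URL and decode its JSON body, raising on non-200 status."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        if resp.status != 200:
            raise RuntimeError(f"unexpected status: {resp.status}")
        return json.loads(resp.read().decode("utf-8"))

if __name__ == "__main__":
    print(get_json("https://httpbin.org/json"))  # placeholder endpoint
```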
Posted 1 day ago
8.0 - 10.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Overview The Data Analyst will be responsible for partnering closely with business and S&T teams in preparing final analysis reports for stakeholders, enabling them to make important decisions based on various facts and trends, and will lead data requirement, source analysis, data analysis, data transformation, and reconciliation activities. This role will interact with the DG, DPM, EA, DE, EDF, PO, and D&AI teams for historical data requirements and for sourcing the data for the Mosaic AI program to scale the solution to new markets. Responsibilities Lead data requirement, source analysis, data analysis, data transformation, and reconciliation activities. Partner with the FP&A Product Owner and associated business SMEs to understand and document business requirements and associated needs Perform the analysis of business data requirements and translate it into a data design that satisfies local, sector, and global requirements Use automated tools to extract data from primary and secondary sources. Use statistical tools to identify, analyse, and interpret patterns and trends in complex data sets to support diagnosis and prediction. Work with engineers and business teams to identify process improvement opportunities and propose system modifications. Proactively identify impediments and look for pragmatic and constructive solutions to mitigate risk. Be a champion for continuous improvement and drive efficiency. Preference will be given to candidates having a functional understanding of financial concepts (P&L, Balance Sheet, Cash Flow, Operating Expense) and experience modelling data and designing data flows Qualifications Bachelor of Technology from a reputed college Minimum 8-10 years of relevant work experience, preferably in data modelling/analytics Minimum 5-6 years of experience navigating data in Azure Databricks, Synapse, Teradata, or similar database technologies Expertise in Azure (Databricks, Data Factory, Data Lake Store Gen2) Proficiency in SQL and Pyspark to analyse data for both development validation and operational support is critical Exposure to GenAI Good communication and presentation skills are a must for this role.
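A hedged PySpark sketch of the source-vs-target reconciliation this role describes; the paths, table layout, and column names are hypothetical:

```python
# Reconciliation sketch: compare aggregated totals between a raw source
# layer and a curated layer, surfacing mismatches. Paths are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("recon").getOrCreate()

source = spark.read.parquet("/mnt/raw/finance/pnl")       # placeholder path
target = spark.read.parquet("/mnt/curated/finance/pnl")   # placeholder path

src_totals = source.groupBy("market").agg(F.sum("amount").alias("src_amount"))
tgt_totals = target.groupBy("market").agg(F.sum("amount").alias("tgt_amount"))

recon = (src_totals.join(tgt_totals, "market", "full_outer")
         .withColumn("diff", F.col("src_amount") - F.col("tgt_amount"))
         .filter(F.abs(F.col("diff")) > 0.01))  # tolerance for float noise
recon.show()
```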
Posted 1 day ago
3.0 years
0 Lacs
Kanayannur, Kerala, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. EY GDS – Data and Analytics (D&A) – SSIS – Senior We’re looking for Informatica or SSIS Engineers with a cloud background (AWS, Azure). Primary skills: Has played key roles in multiple large global transformation programs on business process management Experience in database query using SQL Should have experience working on building/integrating data into a data warehouse. Experience in data profiling and reconciliation Informatica PowerCenter/IBM DataStage/SSIS development Strong proficiency in SQL/PLSQL Good experience in performance tuning ETL workflows and suggesting improvements. Developed expertise in complex data management or application integration solutions and deployment in areas of data migration, data integration, application integration, or data quality. Experience in data processing, orchestration, parallelization, transformations, and ETL fundamentals. Leverages a variety of programming languages and data crawling/processing tools to ensure data reliability, quality, and efficiency (optional) Experience in Cloud Data-related tools (Microsoft Azure, Amazon S3, or Data Lake) Knowledge of Cloud infrastructure, and knowledge of Talend Cloud is an added advantage Knowledge of data modelling principles. Knowledge of Autosys scheduling Good experience in database technologies. Good knowledge of Unix systems Responsibilities: Work as a team member to contribute to various technical streams of data integration projects. Provide product- and design-level technical best practices Interface and communicate with the onsite coordinators Complete assigned tasks on time and report status regularly to the lead Build a quality culture Use an issue-based approach to deliver growth, market, and portfolio strategy engagements for corporates Strong communication, presentation, and team-building skills and experience in producing high-quality reports, papers, and presentations. Experience in executing and managing research and analysis of companies and markets, preferably from a commercial due diligence standpoint. Qualification: BE/BTech/MCA (must) with industry experience of 3-7 years. Experience in Talend jobs, joblets, and custom components. Should have knowledge of error handling and performance tuning in Talend. Experience in big data technologies such as Sqoop, Impala, Hive, Yarn, Spark, etc. Informatica PowerCenter/IBM DataStage/SSIS development Strong proficiency in SQL/PLSQL Good experience in performance tuning ETL workflows and suggesting improvements. Experience with a minimum of 3-4 clients on short-duration projects ranging between 6-8+ months, OR experience with a minimum of 2+ clients on projects ranging between 1-2 years or more People with commercial acumen, technical experience, and enthusiasm to learn new things in this fast-moving environment EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate.
Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
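The data profiling and reconciliation skills the posting above asks for reduce to a familiar post-load check. A minimal sketch with sqlite3 standing in for the real source and warehouse; the tables and rows are illustrative:

```python
# Post-load reconciliation sketch: compare row counts between a staging
# source and the warehouse target. Tables and data are made up.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE staging_orders (id INTEGER, amount REAL);
    CREATE TABLE dw_orders      (id INTEGER, amount REAL);
    INSERT INTO staging_orders VALUES (1, 10.0), (2, 20.0), (3, 30.0);
    INSERT INTO dw_orders      VALUES (1, 10.0), (2, 20.0);
""")

src_count = conn.execute("SELECT COUNT(*) FROM staging_orders").fetchone()[0]
tgt_count = conn.execute("SELECT COUNT(*) FROM dw_orders").fetchone()[0]

if src_count != tgt_count:
    print(f"RECON FAIL: staging={src_count}, warehouse={tgt_count}")
else:
    print("RECON OK")
```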
Posted 1 day ago
4.0 years
0 Lacs
Kochi, Kerala, India
On-site
Introduction In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology. Your Role And Responsibilities As a Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in the development of data solutions using the Spark Framework with Python or Scala on Hadoop and the Azure Cloud Data Platform Experienced in building data pipelines to ingest, process, and transform data from files, streams, and databases. Process the data with Spark, Python, PySpark, and Hive, HBase, or other NoSQL databases on the Azure Cloud Data Platform or HDFS Experienced in developing efficient software code for multiple use cases leveraging the Spark Framework with Python or Scala and Big Data technologies for various use cases built on the platform Experience in developing streaming pipelines Experience working with Hadoop/Azure ecosystem components to implement scalable solutions to meet ever-increasing data volumes, using big data/cloud technologies such as Apache Spark, Kafka, and cloud computing services Preferred Education Master's Degree Required Technical And Professional Expertise 4+ years of experience in Big Data technologies with extensive data engineering experience in Spark with Python or Scala; Minimum 3 years of experience on Cloud Data Platforms on Azure; Experience in Databricks/Azure HDInsight/Azure Data Factory, Synapse, SQL Server DB Good to excellent SQL skills Preferred Technical And Professional Experience Certification in Azure and Databricks, or Cloudera Spark Certified developers Experience in Databricks/Azure HDInsight/Azure Data Factory, Synapse, SQL Server DB Knowledge or experience of Snowflake will be an added advantage
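A hedged sketch of the streaming pipeline pattern this posting names, using Spark Structured Streaming to read from Kafka and land Parquet; the broker, topic, and paths are placeholders, and the cluster is assumed to have the Kafka connector package available:

```python
# Streaming ingest sketch: Kafka -> Spark Structured Streaming -> Parquet.
# Requires the spark-sql-kafka connector on the cluster.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("stream-ingest").getOrCreate()

events = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")  # placeholder
          .option("subscribe", "events")                     # placeholder topic
          .load())

# Kafka delivers key/value as binary; cast the payload to text for parsing.
payload = events.selectExpr("CAST(value AS STRING) AS json_payload")

query = (payload.writeStream
         .format("parquet")
         .option("path", "/mnt/datalake/events")             # placeholder
         .option("checkpointLocation", "/mnt/checkpoints/events")
         .start())
query.awaitTermination()
```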
Posted 1 day ago
6.0 years
0 Lacs
Kanayannur, Kerala, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Data Integration Specialist – Senior The opportunity We are seeking a talented and experienced Integration Specialist with 3–6 years of experience to join our growing Digital Integration team. The ideal candidate will play a pivotal role in designing, building, and deploying scalable and secure solutions that support business transformation, system integration, and automation initiatives across the enterprise. Your Key Responsibilities Work with clients to assess existing integration landscapes and recommend modernization strategies using MuleSoft. Translate business requirements into technical designs, reusable APIs, and integration patterns. Develop, deploy, and manage MuleSoft APIs and integrations on Anypoint Platform (CloudHub, Runtime Fabric, Hybrid). Collaborate with business and IT stakeholders to define integration standards, SLAs, and governance models. Implement error handling, logging, monitoring, and alerting using Anypoint Monitoring and third-party tools. Maintain integration artifacts and documentation, including RAML specifications, flow diagrams, and interface contracts. Ensure performance tuning, scalability, and security best practices are followed across integration solutions. Support CI/CD pipelines, version control, and DevOps processes for MuleSoft assets using platforms like Azure DevOps or GitLab. Collaborate with cross-functional teams (Salesforce, SAP, Data, Cloud, etc.) to deliver end-to-end connected solutions. Stay current with MuleSoft platform capabilities and industry integration trends to recommend improvements and innovations. Troubleshoot integration issues and perform root cause analysis in production and non-production environments. Contribute to internal knowledge-sharing, technical mentoring, and process optimization. Strong SQL, data integration, and data handling skills Exposure to AI models and Python, and to using them in data cleaning/standardization. To qualify for the role, you must have 3–6 years of hands-on experience with MuleSoft Anypoint Platform and Anypoint Studio Strong experience with API-led connectivity and reusable API design (System, Process, Experience layers). Proficient in DataWeave transformations, flow orchestration, and integration best practices. Experience with API lifecycle management including design, development, publishing, governance, and monitoring. Solid understanding of integration patterns (synchronous, asynchronous, event-driven, batch). Hands-on experience with security policies, OAuth, JWT, client ID enforcement, and TLS. Experience in working with cloud platforms (Azure, AWS, or GCP) in the context of integration projects. Knowledge of performance tuning, capacity planning, and error handling in MuleSoft integrations. Experience in DevOps practices including CI/CD pipelines, Git branching strategies, and automated deployments. Experience in data intelligence cloud platforms like Snowflake, Azure, and Databricks Ideally, you’ll also have MuleSoft Certified Developer or Integration Architect certification. Exposure to monitoring and logging tools (e.g., Splunk, Elastic, Anypoint Monitoring).
Strong communication and interpersonal skills to work with technical and non-technical stakeholders. Ability to document integration requirements, user stories, and API contracts clearly and concisely. Experience in agile environments and comfort working across multiple concurrent projects. Ability to mentor junior developers and contribute to reusable component libraries and coding standards. What Working At EY Offers At EY, we’re dedicated to helping our clients, from start–ups to Fortune 500 companies — and the work we do with them is as varied as they are. You get to work with inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer: Support, coaching and feedback from some of the most engaging colleagues around Opportunities to develop new skills and progress your career. The freedom and flexibility to handle your role in a way that’s right for you. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
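The posting above calls out OAuth and client-ID enforcement; here is a hedged sketch of the OAuth2 client-credentials flow using the third-party `requests` package. The token URL, credentials, and API URL are placeholders:

```python
# OAuth2 client-credentials sketch: obtain a bearer token, then call a
# protected API. All URLs and credentials below are placeholders.
import requests

token_resp = requests.post(
    "https://auth.example.com/oauth/token",
    data={"grant_type": "client_credentials",
          "client_id": "my-client-id",
          "client_secret": "my-secret"},
    timeout=10,
)
token_resp.raise_for_status()
access_token = token_resp.json()["access_token"]

api_resp = requests.get(
    "https://api.example.com/process/orders/123",
    headers={"Authorization": f"Bearer {access_token}"},
    timeout=10,
)
print(api_resp.status_code, api_resp.json())
```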
Posted 1 day ago
10.0 years
0 Lacs
Kanayannur, Kerala, India
On-site
At EY, we’re all in to shape your future with confidence. We’ll help you succeed in a globally connected powerhouse of diverse teams and take your career wherever you want it to go. Join EY and help to build a better working world. EY GDS – Data and Analytics (D&A) – Cloud Architect - Manager As part of our EY-GDS D&A (Data and Analytics) team, we help our clients solve complex business challenges with the help of data and technology. We dive deep into data to extract the greatest value and discover opportunities in key business areas and functions like Banking, Insurance, Healthcare, Retail, Manufacturing and Auto, Supply Chain, and Finance. The opportunity We’re looking for Senior Managers (GTM + Cloud/Big Data Architects) with strong technology and data understanding and proven capability in delivery and pre-sales. This is a fantastic opportunity to be part of a leading firm as well as a growing Data and Analytics team. Your Key Responsibilities Have proven experience in driving Analytics GTM/pre-sales by collaborating with senior stakeholders in client and partner organizations in BCM, WAM, and Insurance. Activities will include pipeline building, RFP responses, creating new solutions and offerings, conducting workshops, and managing in-flight projects focused on cloud and big data. Work with clients to convert business problems/challenges into technical solutions considering security, performance, scalability, etc. [10-15 years] Understand current and future state enterprise architecture. Contribute to various technical streams during project implementation. Provide product and design level technical best practices Interact with senior client technology leaders, understand their business goals, and create, architect, propose, develop, and deliver technology solutions Define and develop client-specific best practices around data management within a Hadoop or cloud environment Recommend design alternatives for data ingestion, processing, and provisioning layers Design and develop data ingestion programs to process large data sets in batch mode using HIVE, Pig, Sqoop, and Spark Develop data ingestion programs to ingest real-time data from LIVE sources using Apache Kafka, Spark Streaming, and related technologies Skills And Attributes For Success Architect in designing highly scalable solutions on Azure, AWS, and GCP. Strong understanding of and familiarity with all Azure/AWS/GCP/Bigdata ecosystem components Strong understanding of underlying Azure/AWS/GCP architectural concepts and distributed computing paradigms Hands-on programming experience in Apache Spark using Python/Scala and Spark Streaming Hands-on experience with major components like cloud ETLs, Spark, and Databricks Experience working with NoSQL in at least one of the data stores - HBase, Cassandra, MongoDB Knowledge of Spark and Kafka integration with multiple Spark jobs to consume messages from multiple Kafka partitions Solid understanding of ETL methodologies in a multi-tiered stack, integrating with Big Data systems like Cloudera and Databricks. Strong understanding of underlying Hadoop architectural concepts and distributed computing paradigms Good knowledge of Apache Kafka and Apache Flume Experience in enterprise-grade solution implementations.
Experience in performance benchmarking enterprise applications Experience in data security [in motion, at rest] Strong UNIX operating system concepts and shell scripting knowledge To qualify for the role, you must have Flexible and proactive/self-motivated working style with strong personal ownership of problem resolution. Excellent communicator (written and verbal, formal and informal). Ability to multi-task under pressure and work independently with minimal supervision. Strong verbal and written communication skills. Must be a team player and enjoy working in a cooperative and collaborative team environment. Adaptable to new technologies and standards. Participate in all aspects of the Big Data solution delivery life cycle including analysis, design, development, testing, production deployment, and support Responsible for the evaluation of technical risks and mapping out mitigation strategies Experience in data security [in motion, at rest] Experience in performance benchmarking enterprise applications Working knowledge of any of the cloud platforms: AWS, Azure, or GCP Excellent business communication, consulting, and quality process skills Excellent consulting skills Excellence in leading solution architecture, design, build, and execution for leading clients in the Banking, Wealth and Asset Management, or Insurance domains. Minimum 7 years hands-on experience in one or more of the above areas. Minimum 10 years industry experience Ideally, you’ll also have Strong project management skills Client management skills Solutioning skills What We Look For People with technical experience and enthusiasm to learn new things in this fast-moving environment What Working At EY Offers At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies — and the work we do with them is as varied as they are. You get to work with inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer: Support, coaching and feedback from some of the most engaging colleagues around Opportunities to develop new skills and progress your career The freedom and flexibility to handle your role in a way that’s right for you EY | Building a better working world EY is building a better working world by creating new value for clients, people, society and the planet, while building trust in capital markets. Enabled by data, AI and advanced technology, EY teams help clients shape the future with confidence and develop answers for the most pressing issues of today and tomorrow. EY teams work across a full spectrum of services in assurance, consulting, tax, strategy and transactions. Fueled by sector insights, a globally connected, multi-disciplinary network and diverse ecosystem partners, EY teams can provide services in more than 150 countries and territories.
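The batch-mode ingestion this role describes (landing raw files into a partitioned data-lake layer) looks roughly like the following hedged PySpark sketch; the source and target paths are placeholders:

```python
# Batch ingestion sketch: raw CSV landing zone -> partitioned Parquet lake.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("batch-ingest").getOrCreate()

raw = (spark.read
       .option("header", "true")
       .option("inferSchema", "true")
       .csv("/mnt/landing/transactions/*.csv"))   # placeholder source

# Stamp each batch with its load date and partition the lake by it, so
# downstream jobs can prune partitions instead of scanning everything.
(raw.withColumn("ingest_date", F.current_date())
    .write
    .mode("append")
    .partitionBy("ingest_date")
    .parquet("/mnt/datalake/transactions"))       # placeholder target
```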
Posted 1 day ago
15.0 years
0 Lacs
Kochi, Kerala, India
On-site
Introduction Joining the IBM Technology Expert Labs teams means you’ll have a career delivering world-class services for our clients. As the ultimate expert in IBM products, you’ll bring together all the necessary technology and services to help customers solve their most challenging problems. Working in IBM Technology Expert Labs means accelerating the time to value confidently and ensuring speed and insight while our clients focus on what they do best—running and growing their business. Excellent onboarding and industry-leading learning culture will set you up for a positive impact, while advancing your career. Our culture is collaborative and experiential. As part of a team, you will be surrounded by bright minds and keen co-creators—always willing to help and be helped—as you apply passion to work that will positively impact the world around us. Your Role And Responsibilities As a Delivery Consultant, you will work closely with IBM clients and partners to design, deliver, and optimize IBM Technology solutions that align with your clients’ goals. In this role, you will apply your technical expertise to ensure world-class delivery while leveraging your consultative skills such as problem-solving, issue-/hypothesis-based methodologies, communication, and service orientation skills. As a member of IBM Technology Expert Labs, a team that is client-focused, courageous, pragmatic, and technical, you’ll collaborate with clients to optimize and trailblaze new solutions that address real business challenges. If you are passionate about success with both your career and solving clients’ business challenges, this role is for you. To help achieve this win-win outcome, a ‘day-in-the-life’ of this opportunity may include, but not be limited to… Solving Client Challenges Effectively: Understanding clients’ main challenges and developing solutions that help them reach true business value by working through the phases of design, development, integration, implementation, migration, and product support with a sense of urgency. Agile Planning and Execution: Creating and executing agile plans where you are responsible for installing and provisioning, testing, migrating to production, and day-two operations. Technical Solution Workshops: Conducting and participating in technical solution workshops. Building Effective Relationships: Developing successful relationships at all levels —from engineers to CxOs—with experience of navigating challenging debate to reach healthy resolutions. Self-Motivated Problem Solver: Demonstrating a natural bias towards self-motivation, curiosity, initiative in addition to navigating data and people to find answers and present solutions. Collaboration and Communication: Strong collaboration and communication skills as you work across the client, partner, and IBM team. Preferred Education Bachelor's Degree Required Technical And Professional Expertise In-depth knowledge of the IBM Data & AI portfolio.
15+ years of experience in software services 10+ years of experience in the planning, design, and delivery of one or more products from the IBM Data Integration and IBM Data Intelligence product platforms Experience in designing and implementing solutions on IBM Cloud Pak for Data, IBM DataStage Nextgen, and Orchestration Pipelines 10+ years’ experience with ETL and database technologies Experience in architectural planning and implementation for the upgrade/migration of these specific products Experience in designing and implementing Data Quality solutions Experience with installation and administration of these products Excellent understanding of cloud concepts and infrastructure Excellent verbal and written communication skills are essential Preferred Technical And Professional Experience Experience with any of the DataStage, Informatica, SAS, or Talend products Experience with any of IKC, IGC, Axon Experience with programming languages like Java/Python Experience with the AWS, Azure, Google, or IBM cloud platforms Experience with Red Hat OpenShift Good-to-have knowledge: Apache Spark, shell scripting, GitHub, JIRA
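Data Quality solutions of the kind this posting names usually start from simple declarative rules. An illustrative Python sketch; the rules and sample rows are made up, not from any IBM product:

```python
# Illustrative data-quality rule checks: completeness and domain validity.
rows = [
    {"customer_id": "C001", "email": "a@example.com", "country": "IN"},
    {"customer_id": None,   "email": "b@example.com", "country": "IN"},
    {"customer_id": "C003", "email": "",              "country": "XX"},
]

VALID_COUNTRIES = {"IN", "US", "GB"}  # assumed reference domain

def check_row(row: dict) -> list[str]:
    """Return a list of rule violations for one record."""
    issues = []
    if not row["customer_id"]:
        issues.append("missing customer_id")
    if not row["email"]:
        issues.append("missing email")
    if row["country"] not in VALID_COUNTRIES:
        issues.append(f"unknown country {row['country']!r}")
    return issues

for i, row in enumerate(rows):
    for issue in check_row(row):
        print(f"row {i}: {issue}")
```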
Posted 1 day ago
3.0 - 7.0 years
0 Lacs
Kanayannur, Kerala, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. EY GDS – Data and Analytics (D&A) – Senior – Senior Data Scientist Role Overview: We are seeking a highly skilled and experienced Senior Data Scientist with a minimum of 3 - 7 years of experience in Data Science and Machine Learning, preferably with experience in NLP, Generative AI, LLMs, MLOps, Optimization techniques, and AI solution Architecture. In this role, you will play a key role in the development and implementation of AI solutions, leveraging your technical expertise. The ideal candidate should have a deep understanding of AI technologies and experience in designing and implementing cutting-edge AI models and systems. Additionally, expertise in data engineering, DevOps, and MLOps practices will be valuable in this role. Responsibilities: Your technical responsibilities: Contribute to the design and implementation of state-of-the-art AI solutions. Assist in the development and implementation of AI models and systems, leveraging techniques such as Language Models (LLMs) and generative AI. Collaborate with stakeholders to identify business opportunities and define AI project goals. Stay updated with the latest advancements in generative AI techniques, such as LLMs, and evaluate their potential applications in solving enterprise challenges. Utilize generative AI techniques, such as LLMs, to develop innovative solutions for enterprise industry use cases. Integrate with relevant APIs and libraries, such as Azure Open AI GPT models and Hugging Face Transformers, to leverage pre-trained models and enhance generative AI capabilities. Implement and optimize end-to-end pipelines for generative AI projects, ensuring seamless data processing and model deployment. Utilize vector databases, such as Redis, and NoSQL databases to efficiently handle large-scale generative AI datasets and outputs. Implement similarity search algorithms and techniques to enable efficient and accurate retrieval of relevant information from generative AI outputs. Collaborate with domain experts, stakeholders, and clients to understand specific business requirements and tailor generative AI solutions accordingly. Conduct research and evaluation of advanced AI techniques, including transfer learning, domain adaptation, and model compression, to enhance performance and efficiency. Establish evaluation metrics and methodologies to assess the quality, coherence, and relevance of generative AI outputs for enterprise industry use cases. Ensure compliance with data privacy, security, and ethical considerations in AI applications. Leverage data engineering skills to curate, clean, and preprocess large-scale datasets for generative AI applications. Requirements: Bachelor's or Master's degree in Computer Science, Engineering, or a related field. A Ph.D. is a plus. Minimum 3-7 years of experience in Data Science and Machine Learning. In-depth knowledge of machine learning, deep learning, and generative AI techniques. Proficiency in programming languages such as Python, R, and frameworks like TensorFlow or PyTorch. Strong understanding of NLP techniques and frameworks such as BERT, GPT, or Transformer models. 
Familiarity with computer vision techniques for image recognition, object detection, or image generation. Experience with cloud platforms such as Azure, AWS, or GCP and deploying AI solutions in a cloud environment. Expertise in data engineering, including data curation, cleaning, and preprocessing. Knowledge of trusted AI practices, ensuring fairness, transparency, and accountability in AI models and systems. Strong collaboration with software engineering and operations teams to ensure seamless integration and deployment of AI models. Excellent problem-solving and analytical skills, with the ability to translate business requirements into technical solutions. Strong communication and interpersonal skills, with the ability to collaborate effectively with stakeholders at various levels. Understanding of data privacy, security, and ethical considerations in AI applications. Track record of driving innovation and staying updated with the latest AI research and advancements. Good to Have Skills: Apply trusted AI practices to ensure fairness, transparency, and accountability in AI models and systems. Utilize optimization tools and techniques, including MIP (Mixed Integer Programming). Drive DevOps and MLOps practices, covering continuous integration, deployment, and monitoring of AI models. Implement CI/CD pipelines for streamlined model deployment and scaling processes. Utilize tools such as Docker, Kubernetes, and Git to build and manage AI pipelines. Apply infrastructure as code (IaC) principles, employing tools like Terraform or CloudFormation. Implement monitoring and logging tools to ensure AI model performance and reliability. Collaborate seamlessly with software engineering and operations teams for efficient AI model integration and deployment. Familiarity with DevOps and MLOps practices, including continuous integration, deployment, and monitoring of AI models. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
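The posting above mentions leveraging pre-trained models via Hugging Face Transformers. A hedged sketch of the `pipeline` API; the checkpoint name is an assumption, pick whatever model fits the use case:

```python
# Sketch: call a pre-trained text-generation model via the transformers
# pipeline API. "gpt2" is an assumed small checkpoint for illustration.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")
out = generator("Summarize the quarterly risk report:",
                max_new_tokens=40, num_return_sequences=1)
print(out[0]["generated_text"])
```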
Posted 1 day ago
10.0 years
0 Lacs
Kanayannur, Kerala, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Job Description For Lead Data Engineer QA Rank – Manager Location – Bengaluru/Chennai/Kerala/Kolkata Objectives and Purpose The Lead Data Engineer QA will be responsible for testing business intelligence and data warehouse solutions, both on-premises and in the cloud. We are seeking an innovative and talented individual who can create test plans, protocols, and procedures for new software. In addition, you will support the build of large-scale data architectures that provide information to downstream systems and business users. Your Key Responsibilities Design and execute manual and automated test cases, including validating alignment with ELT data integrity and compliance. Support conducting QA test case designs, including identifying opportunities for test automation and developing scripts for automated processes as needed. Follow quality standards, conduct continuous monitoring and improvement, and manage test cases, test data, and defect processes using a risk-based approach as needed. Ensure all software releases meet regulatory standards, including requirements for validation, documentation, and traceability, with particular emphasis on data privacy and adherence to infrastructure security best practices. Proactively foster strong partnerships across teams and stakeholders to ensure alignment with quality requirements and address any challenges. Implement observability within testing processes to proactively identify, track, and resolve quality issues, contributing to sustained high-quality performance. Work closely with the product team to monitor data quality, integrity, and security throughout the product lifecycle, implementing data quality checks to ensure accuracy, completeness, and consistency. Lead the evaluation, implementation, and deployment of emerging tools and processes to improve productivity. Develop and maintain scalable data pipelines, in line with ETL principles, and build out new integrations, using AWS native technologies, to support continuing increases in data source, volume, and complexity. Define data requirements, gather, and mine data, while validating the efficiency of data tools in the Big Data environment. Establish a methodology to test the effectiveness of BI and DWH projects, ELT reports, integration, and manual and automation functionality. Implement processes and systems to provide accurate and available data to key stakeholders, downstream systems, and business processes. Partner with Business Analytics and Solution Architects to develop technical architectures for strategic enterprise projects and initiatives. Coordinate with Data Scientists to understand data requirements, and design solutions that enable advanced analytics, machine learning, and predictive modelling. Mentor and coach junior Data Engineers on data standards and practices, promoting the values of learning and growth. Foster a culture of sharing, re-use, design for scale and stability, and operational efficiency of data and analytical solutions.
To qualify for the role, you must have the following: Essential Skillsets Bachelor’s degree in Engineering, Computer Science, Data Warehousing, or related field 10+ years of experience in software development, data science, data engineering, ETL, and analytics reporting development Understanding of project and test lifecycle, including exposure to CMMi and process improvement frameworks Experience designing, building, implementing, and maintaining data and system integrations using dimensional data modelling and development and optimization of ETL pipelines Proven track record of designing and implementing complex data solutions Understanding of business intelligence concepts, ETL processing, dashboards, and analytics Testing experience in Data Quality, ETL, OLAP, or Reports Knowledge in Data Transformation Projects, including database design concepts and white-box testing Experience in cloud-based data solutions – AWS/Azure Demonstrated understanding and experience using: Cloud-based data solutions (AWS, IICS, Databricks) GXP and regulatory and risk compliance Cloud AWS infrastructure testing Python data processing SQL scripting Test processes (e.g., ELT testing, SDLC) Power BI/Tableau Scripting (e.g., Perl and shell) Data Engineering Programming Languages (i.e., Python) Distributed Data Technologies (e.g., Pyspark) Test Management and Defect Management tools (e.g., HP ALM) Cloud platform deployment and tools (e.g., Kubernetes) DevOps and continuous integration Databricks/ETL Understanding of database architecture and administration Utilizes the principles of continuous integration and delivery to automate the deployment of code changes to elevated environments, fostering enhanced code quality, test coverage, and automation of resilient test cases Possesses high proficiency in programming languages (e.g., SQL, Python, Pyspark, AWS services) to design, maintain, and optimize data architecture/pipelines that fit business goals Strong organizational skills with the ability to manage multiple projects simultaneously and operate as a leading member across globally distributed teams to deliver high-quality services and solutions Excellent written and verbal communication skills, including storytelling and interacting effectively with multifunctional teams and other strategic partners Strong problem solving and troubleshooting skills Ability to work in a fast-paced environment and adapt to changing business priorities EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
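The automated ETL test cases this role designs often live in a plain test framework. A hedged sketch in pytest style; the `load_target` function and expected values are hypothetical stand-ins for a real pipeline (run with `pytest`):

```python
# Sketch of automated data-quality tests for a loaded warehouse table.
# load_target() is a hypothetical stand-in for reading the real target.

def load_target() -> list[dict]:
    # Placeholder for reading the loaded warehouse table.
    return [{"id": 1, "amount": 10.0}, {"id": 2, "amount": 20.0}]

def test_no_duplicate_keys():
    ids = [r["id"] for r in load_target()]
    assert len(ids) == len(set(ids)), "primary key violated"

def test_amounts_non_negative():
    assert all(r["amount"] >= 0 for r in load_target())

def test_row_count_matches_source():
    expected_source_rows = 2  # would come from the source system
    assert len(load_target()) == expected_source_rows
```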
Posted 1 day ago
3.0 - 7.0 years
0 Lacs
Kochi, Kerala, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. EY GDS – Data and Analytics (D&A) – Senior – Senior Data Scientist Role Overview: We are seeking a highly skilled and experienced Senior Data Scientist with a minimum of 3 - 7 years of experience in Data Science and Machine Learning, preferably with experience in NLP, Generative AI, LLMs, MLOps, Optimization techniques, and AI solution Architecture. In this role, you will play a key role in the development and implementation of AI solutions, leveraging your technical expertise. The ideal candidate should have a deep understanding of AI technologies and experience in designing and implementing cutting-edge AI models and systems. Additionally, expertise in data engineering, DevOps, and MLOps practices will be valuable in this role. Responsibilities: Your technical responsibilities: Contribute to the design and implementation of state-of-the-art AI solutions. Assist in the development and implementation of AI models and systems, leveraging techniques such as Language Models (LLMs) and generative AI. Collaborate with stakeholders to identify business opportunities and define AI project goals. Stay updated with the latest advancements in generative AI techniques, such as LLMs, and evaluate their potential applications in solving enterprise challenges. Utilize generative AI techniques, such as LLMs, to develop innovative solutions for enterprise industry use cases. Integrate with relevant APIs and libraries, such as Azure Open AI GPT models and Hugging Face Transformers, to leverage pre-trained models and enhance generative AI capabilities. Implement and optimize end-to-end pipelines for generative AI projects, ensuring seamless data processing and model deployment. Utilize vector databases, such as Redis, and NoSQL databases to efficiently handle large-scale generative AI datasets and outputs. Implement similarity search algorithms and techniques to enable efficient and accurate retrieval of relevant information from generative AI outputs. Collaborate with domain experts, stakeholders, and clients to understand specific business requirements and tailor generative AI solutions accordingly. Conduct research and evaluation of advanced AI techniques, including transfer learning, domain adaptation, and model compression, to enhance performance and efficiency. Establish evaluation metrics and methodologies to assess the quality, coherence, and relevance of generative AI outputs for enterprise industry use cases. Ensure compliance with data privacy, security, and ethical considerations in AI applications. Leverage data engineering skills to curate, clean, and preprocess large-scale datasets for generative AI applications. Requirements: Bachelor's or Master's degree in Computer Science, Engineering, or a related field. A Ph.D. is a plus. Minimum 3-7 years of experience in Data Science and Machine Learning. In-depth knowledge of machine learning, deep learning, and generative AI techniques. Proficiency in programming languages such as Python, R, and frameworks like TensorFlow or PyTorch. Strong understanding of NLP techniques and frameworks such as BERT, GPT, or Transformer models. 
Familiarity with computer vision techniques for image recognition, object detection, or image generation. Experience with cloud platforms such as Azure, AWS, or GCP and deploying AI solutions in a cloud environment. Expertise in data engineering, including data curation, cleaning, and preprocessing. Knowledge of trusted AI practices, ensuring fairness, transparency, and accountability in AI models and systems. Strong collaboration with software engineering and operations teams to ensure seamless integration and deployment of AI models. Excellent problem-solving and analytical skills, with the ability to translate business requirements into technical solutions. Strong communication and interpersonal skills, with the ability to collaborate effectively with stakeholders at various levels. Understanding of data privacy, security, and ethical considerations in AI applications. Track record of driving innovation and staying updated with the latest AI research and advancements. Good to Have Skills: Apply trusted AI practices to ensure fairness, transparency, and accountability in AI models and systems. Utilize optimization tools and techniques, including MIP (Mixed Integer Programming). Drive DevOps and MLOps practices, covering continuous integration, deployment, and monitoring of AI models. Implement CI/CD pipelines for streamlined model deployment and scaling processes. Utilize tools such as Docker, Kubernetes, and Git to build and manage AI pipelines. Apply infrastructure as code (IaC) principles, employing tools like Terraform or CloudFormation. Implement monitoring and logging tools to ensure AI model performance and reliability. Collaborate seamlessly with software engineering and operations teams for efficient AI model integration and deployment. Familiarity with DevOps and MLOps practices, including continuous integration, deployment, and monitoring of AI models. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
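The role above calls for implementing similarity search over generative AI outputs. A minimal sketch of the underlying idea, cosine similarity over embedding vectors, follows; the random vectors are stand-ins for real model embeddings, and all names are illustrative. A production system would typically delegate this to a vector store such as the Redis mentioned above.

```python
# A minimal cosine-similarity retrieval sketch; embeddings are random
# stand-ins for real embedding-model outputs.
import numpy as np

def cosine_top_k(query_vec: np.ndarray, corpus: np.ndarray, k: int = 3):
    """Return (indices, scores) of the k corpus rows most similar to the query."""
    q = query_vec / np.linalg.norm(query_vec)
    c = corpus / np.linalg.norm(corpus, axis=1, keepdims=True)
    scores = c @ q                        # cosine similarity per corpus row
    top = np.argsort(scores)[::-1][:k]    # highest-scoring rows first
    return top, scores[top]

rng = np.random.default_rng(0)
corpus = rng.normal(size=(1000, 768))     # 1,000 "documents", 768-dim embeddings
query = rng.normal(size=768)

idx, scores = cosine_top_k(query, corpus, k=3)
print(idx, scores)
```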
Posted 1 day ago
12.0 years
0 Lacs
Kochi, Kerala, India
On-site
At EY, we’re all in to shape your future with confidence. We’ll help you succeed in a globally connected powerhouse of diverse teams and take your career wherever you want it to go. Join EY and help to build a better working world. Job Description Automation Title Data Architect Type of Employment Permanent Overall Years Of Experience 12-15 years Relevant Years Of Experience 10+ Data Architect The Data Architect is responsible for designing and implementing data architecture for multiple projects and also builds strategies for data governance Position Summary 12-15 years of experience in a similar profile with a strong service delivery background Experience as a Data Architect with a focus on Spark and Data Lake technologies. Experience in Azure Synapse Analytics Proficiency in Apache Spark for large-scale data processing. Expertise in Databricks, Delta Lake, Azure Data Factory, and other cloud-based data services. Strong understanding of data modeling, ETL processes, and data warehousing principles. Implement a data governance framework with Unity Catalog. Knowledge in designing scalable streaming data pipelines using Azure Event Hubs, Azure Stream Analytics, and Spark streaming Experience with SQL and NoSQL databases, as well as familiarity with big data file formats like Parquet and Avro. Hands-on experience in Python and relevant libraries such as PySpark, NumPy, etc. Knowledge of Machine Learning pipelines, GenAI, and LLMs will be a plus Excellent analytical, problem-solving, and technical leadership skills. Experience in integration with business intelligence tools such as Power BI Effective communication and collaboration abilities Excellent interpersonal skills and a collaborative management style Own and delegate responsibilities effectively Ability to analyse and suggest solutions Strong command of verbal and written English Essential Roles and Responsibilities Work as a Data Architect, able to design and implement data architecture for projects involving complex data such as big data and data lakes Work with customers to define strategy for data architecture and data governance Guide the team to implement solutions around data engineering Proactively identify risks and communicate to stakeholders. Develop strategies to mitigate risks Build best practices to enable faster service delivery Build reusable components to reduce cost Build scalable and cost-effective architecture EY | Building a better working world EY is building a better working world by creating new value for clients, people, society and the planet, while building trust in capital markets. Enabled by data, AI and advanced technology, EY teams help clients shape the future with confidence and develop answers for the most pressing issues of today and tomorrow. EY teams work across a full spectrum of services in assurance, consulting, tax, strategy and transactions. Fueled by sector insights, a globally connected, multi-disciplinary network and diverse ecosystem partners, EY teams can provide services in more than 150 countries and territories.
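For the streaming pipeline design this role describes (Event Hubs feeding Spark Structured Streaming into Delta Lake), a minimal sketch follows. The namespace, topic, and paths are placeholders; it assumes the Kafka and Delta connectors available by default on Databricks and Event Hubs' Kafka-compatible endpoint, with SASL authentication options omitted for brevity.

```python
# A minimal Event Hubs -> Spark Structured Streaming -> Delta sketch;
# endpoint, topic, and paths are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("eventhub-to-delta").getOrCreate()

raw = (spark.readStream
       .format("kafka")  # Event Hubs exposes a Kafka-compatible endpoint
       .option("kafka.bootstrap.servers", "<namespace>.servicebus.windows.net:9093")
       .option("subscribe", "policy-events")          # illustrative topic
       .option("startingOffsets", "latest")
       # SASL/SSL auth options for Event Hubs omitted for brevity.
       .load())

# Payloads arrive as bytes; decode them and stamp the ingestion time.
events = (raw.selectExpr("CAST(value AS STRING) AS body")
             .withColumn("ingested_at", F.current_timestamp()))

query = (events.writeStream
         .format("delta")
         .option("checkpointLocation", "/mnt/checkpoints/policy-events")
         .start("/mnt/delta/policy_events"))
query.awaitTermination()
```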
Posted 1 day ago
4.0 - 9.0 years
0 Lacs
Kanayannur, Kerala, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. EY GDS – Data and Analytics (D&A) – Senior – Insurance Data Analyst We are seeking a highly skilled and motivated Data Analyst with experience in ETL services to join our dynamic team. As a Data Analyst, you will be responsible for data requirement gathering, preparing data requirement artefacts, preparing data integration strategies, and ensuring data quality; you will work closely with data engineering teams to ensure seamless data flow across our systems. Key Responsibilities: Expertise in the P&C Insurance domain. Interact with stakeholders and source teams to gather data requirements. Specialized skill in Policy and/or Claims and/or Billing insurance source systems. Thorough understanding of the life cycle of Policy and Claims. Should have a good understanding of the various transactions involved. Prepare data dictionaries and source-to-target mappings and understand the underlying transformation logic Experience in any of the insurance products, including Guidewire and/or Duck Creek Good understanding of insurance data models, including Policy Centre, Claim Centre and Billing Centre Create various data scenarios using the insurance suite for the data team to consume for testing Experience and/or understanding of any Insurance Statutory or Regulatory reports is an add-on Discover, design, and develop analytical methods to support novel approaches of data and information processing Perform data profiling manually or using profiling tools Identify critical data elements and PII handling process/mandates Understand the handling process of historic and incremental data loads and generate clear requirements for data integration and processing for the engineering team Perform analysis to assess the quality of the data, determine the meaning of the data, and provide data facts and insights Interface and communicate with the onsite teams directly to understand the requirement and determine the optimum data intake process Responsible for creating the HLD/LLD to enable the data engineering team to work on the build Provide product and design level functional and technical expertise along with best practices Required Skills and Qualifications: BE/BTech/MTech/MCA with 4 - 9 years of industry experience with data analysis, management and related data service offerings Experience in Insurance domains Strong analytical skills Strong SQL experience Good To Have: Experience using Agile methodologies Experience using cloud technologies such as AWS or Azure Other Key capabilities: Client facing skills and proven ability in effective planning, executing and problem-solving Excellent communication, interpersonal, and teamworking skills Multi-tasking attitude, flexible with ability to change priorities quickly Methodical approach, logical thinking, and ability to plan work and meet deadlines Accuracy and attention to detail Written and verbal communication skills Ability to plan resource requirements from high level specifications EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets.
Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
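Since the role above centres on data profiling and identifying critical data elements, a minimal sketch of a manual profile in pandas follows; the file and column names are illustrative assumptions rather than anything from the posting.

```python
# A minimal data-profiling sketch; file and column names are hypothetical.
import pandas as pd

df = pd.read_csv("policy_extract.csv")    # stand-in for a policy source extract

profile = pd.DataFrame({
    "dtype": df.dtypes.astype(str),
    "non_null": df.notna().sum(),
    "null_pct": (df.isna().mean() * 100).round(2),
    "distinct": df.nunique(),
})
# Columns with the most missing data surface first -- candidates for DQ rules.
print(profile.sort_values("null_pct", ascending=False))

# Completeness rule for a critical data element (illustrative column name).
assert df["policy_number"].notna().all(), "policy_number must be populated"
```

A profile like this is typically the raw material for the data dictionaries and source-to-target mappings the role produces.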
Posted 1 day ago
6.0 years
0 Lacs
Andhra Pradesh, India
On-site
SDET Primary Skills Testing tools (Selenium, Parasoft, DB testing), Robot Framework Secondary Skills C#, Python, Java JD 6+ years of experience with automation testing tools (Selenium, Parasoft, DB testing) and Robot Framework Demonstrated experience of automating tests across multiple platforms and technologies, and an understanding of its application throughout the full development lifecycle. In-sprint test automation Knowledge of test case management packages Demonstrated experience of hands-on issue and defect management. Should have expertise in at least one programming language (such as C#, Python, or Java) in addition to continuous integration tools (such as Jenkins or Azure DevOps) Responsibility The above, plus: Knowledge and experience with modern testing development practices and integrated testing products such as Selenium and their integration with tools such as GitLab, etc. Experience of non-functional testing as well as backup and recovery, DR, and performance Experience of Agile software development and testing methods for deployment in cloud environments. Previous development experience, cloud migration testing experience and capital markets experience would be an added advantage
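As a concrete illustration of the in-sprint Selenium automation this posting asks for, a minimal sketch follows; the URL and locator are placeholders, and it assumes Selenium 4+ (which resolves the browser driver automatically via Selenium Manager).

```python
# A minimal Selenium check of the kind run in-sprint and wired into CI
# (e.g., Jenkins or Azure DevOps); URL and locator are placeholders.
from selenium import webdriver
from selenium.webdriver.common.by import By

def test_login_page_renders():
    driver = webdriver.Chrome()
    try:
        driver.get("https://example.com/login")      # placeholder URL
        assert "Login" in driver.title
        # Fail fast if the critical control is missing from the page.
        driver.find_element(By.ID, "username")       # placeholder locator
    finally:
        driver.quit()
```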
Posted 1 day ago
0 years
0 Lacs
Andhra Pradesh, India
On-site
Design, develop, and maintain high-performance, scalable Java applications using Java, Spring Boot and React/Angular. Build REST APIs and SDKs. Should be excellent in Java, OOP concepts & Java Collections. Should be excellent in Spring Boot/Spring/Hibernate. Strong proficiency in Java and related frameworks (e.g., Spring, Hibernate). Should have worked on REST API and microservices implementation Experience with cloud platforms (e.g., AWS, Azure, Google Cloud). Experience in AWS, Docker and Kubernetes. Knowledge of microservices architecture. Familiarity with CI/CD pipelines and DevOps practices. Excellent communication skills Ability to work effectively in a fast-paced, collaborative environment.
Posted 1 day ago
3.0 years
0 Lacs
Andhra Pradesh, India
On-site
Key Responsibilities Provide L1/L1.5 support for applications developed in .Net. Troubleshoot and resolve application issues, ensuring minimal disruption to business operations. Monitor application performance and system health, identifying and addressing potential issues proactively. Collaborate with development teams and other stakeholders to resolve complex technical problems. Perform root cause analysis for recurring issues and implement permanent solutions. Develop and maintain documentation for troubleshooting steps, issue resolution, and standard operating procedures. Assist in the deployment and configuration of applications and updates in the Azure environment. Execute SQL queries to retrieve data, diagnose issues, and provide insights for application support. Participate in on-call rotation to provide 24/7 support for critical applications. Communicate effectively with users, providing updates on issue resolution and system status. Required Skills Proven experience in application support, specifically with .Net, Azure, and SQL. Strong understanding of the .Net framework and C# programming. Proficiency in SQL, with the ability to write and troubleshoot complex queries. Excellent problem-solving skills and the ability to work under pressure. Strong communication skills, both written and verbal. Ability to work independently and as part of a team. Willingness to participate in on-call support and work in shifts. Preferred Skills 3+ years of related experience Knowledge of ITIL framework and incident management processes. Familiarity with automation tools and techniques to streamline support processes. Experience in supporting insurance domain applications.
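The support duties above lean on SQL for diagnosis. Although hands-on triage for a .Net stack would more likely happen in SSMS or application tooling, a minimal Python/pyodbc sketch of the kind of diagnostic query involved is shown below; the connection string, table, and columns are all hypothetical.

```python
# A minimal diagnostic-query sketch; connection string, table, and columns
# are hypothetical -- in practice this query might run from SSMS instead.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=app-db.example.com;DATABASE=AppDb;Trusted_Connection=yes;"
)
cursor = conn.cursor()

# Surface the most recent application errors for triage.
cursor.execute("""
    SELECT TOP 20 logged_at, severity, message
    FROM dbo.AppErrorLog
    WHERE logged_at >= DATEADD(hour, -24, SYSUTCDATETIME())
    ORDER BY logged_at DESC
""")
for row in cursor.fetchall():
    print(row.logged_at, row.severity, row.message)
conn.close()
```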
Posted 1 day ago
12.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
At EY, we’re all in to shape your future with confidence. We’ll help you succeed in a globally connected powerhouse of diverse teams and take your career wherever you want it to go. Join EY and help to build a better working world. Job Description Automation Title Data Architect Type of Employment Permanent Overall Years Of Experience 12-15 years Relevant Years Of Experience 10+ Data Architect The Data Architect is responsible for designing and implementing data architecture for multiple projects and also builds strategies for data governance Position Summary 12-15 years of experience in a similar profile with a strong service delivery background Experience as a Data Architect with a focus on Spark and Data Lake technologies. Experience in Azure Synapse Analytics Proficiency in Apache Spark for large-scale data processing. Expertise in Databricks, Delta Lake, Azure Data Factory, and other cloud-based data services. Strong understanding of data modeling, ETL processes, and data warehousing principles. Implement a data governance framework with Unity Catalog. Knowledge in designing scalable streaming data pipelines using Azure Event Hubs, Azure Stream Analytics, and Spark streaming Experience with SQL and NoSQL databases, as well as familiarity with big data file formats like Parquet and Avro. Hands-on experience in Python and relevant libraries such as PySpark, NumPy, etc. Knowledge of Machine Learning pipelines, GenAI, and LLMs will be a plus Excellent analytical, problem-solving, and technical leadership skills. Experience in integration with business intelligence tools such as Power BI Effective communication and collaboration abilities Excellent interpersonal skills and a collaborative management style Own and delegate responsibilities effectively Ability to analyse and suggest solutions Strong command of verbal and written English Essential Roles and Responsibilities Work as a Data Architect, able to design and implement data architecture for projects involving complex data such as big data and data lakes Work with customers to define strategy for data architecture and data governance Guide the team to implement solutions around data engineering Proactively identify risks and communicate to stakeholders. Develop strategies to mitigate risks Build best practices to enable faster service delivery Build reusable components to reduce cost Build scalable and cost-effective architecture EY | Building a better working world EY is building a better working world by creating new value for clients, people, society and the planet, while building trust in capital markets. Enabled by data, AI and advanced technology, EY teams help clients shape the future with confidence and develop answers for the most pressing issues of today and tomorrow. EY teams work across a full spectrum of services in assurance, consulting, tax, strategy and transactions. Fueled by sector insights, a globally connected, multi-disciplinary network and diverse ecosystem partners, EY teams can provide services in more than 150 countries and territories.
Posted 1 day ago
10.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
At EY, we’re all in to shape your future with confidence. We’ll help you succeed in a globally connected powerhouse of diverse teams and take your career wherever you want it to go. Join EY and help to build a better working world. EY GDS – Data and Analytics (D&A) – Cloud Architect - Manager As part of our EY-GDS D&A (Data and Analytics) team, we help our clients solve complex business challenges with the help of data and technology. We dive deep into data to extract the greatest value and discover opportunities in key businesses and functions like Banking, Insurance, Manufacturing and Auto, Healthcare, Retail, Supply Chain, and Finance. The opportunity We’re looking for Senior Managers (GTM + Cloud/Big Data Architects) with strong technology and data understanding and proven capability in delivery and pre-sales. This is a fantastic opportunity to be part of a leading firm as well as part of a growing Data and Analytics team. Your Key Responsibilities Have proven experience in driving Analytics GTM/pre-sales by collaborating with senior stakeholders in the client and partner organization in BCM, WAM, and Insurance. Activities will include pipeline building, RFP responses, creating new solutions and offerings, and conducting workshops, as well as managing in-flight projects focused on cloud and big data. Need to work with clients in converting business problems/challenges to technical solutions considering security, performance, scalability, etc. [10-15 years] Need to understand current and future state enterprise architecture. Need to contribute to various technical streams during implementation of the project. Provide product and design level technical best practices Interact with senior client technology leaders, understand their business goals, and create, architect, propose, develop and deliver technology solutions Define and develop client-specific best practices around data management within a Hadoop or cloud environment Recommend design alternatives for data ingestion, processing and provisioning layers Design and develop data ingestion programs to process large data sets in batch mode using Hive, Pig, Sqoop and Spark Develop data ingestion programs to ingest real-time data from live sources using Apache Kafka, Spark Streaming and related technologies Skills And Attributes For Success Experience architecting highly scalable solutions on Azure, AWS and GCP. Strong understanding of and familiarity with Azure/AWS/GCP big data ecosystem components Strong understanding of underlying Azure/AWS/GCP architectural concepts and distributed computing paradigms Hands-on programming experience in Apache Spark using Python/Scala and Spark Streaming Hands-on experience with major components like cloud ETLs, Spark and Databricks Experience working with NoSQL in at least one of the data stores - HBase, Cassandra, MongoDB Knowledge of Spark and Kafka integration with multiple Spark jobs to consume messages from multiple Kafka partitions Solid understanding of ETL methodologies in a multi-tiered stack, integrating with big data systems like Cloudera and Databricks Strong understanding of underlying Hadoop architectural concepts and distributed computing paradigms Good knowledge of Apache Kafka and Apache Flume Experience in enterprise-grade solution implementations.
Experience in performance benchmarking enterprise applications Experience in data security [on the move, at rest] Strong UNIX operating system concepts and shell scripting knowledge To qualify for the role, you must have Flexible and proactive/self-motivated working style with strong personal ownership of problem resolution. Excellent communicator (written and verbal, formal and informal). Ability to multi-task under pressure and work independently with minimal supervision. Strong verbal and written communication skills. Must be a team player and enjoy working in a cooperative and collaborative team environment. Adaptable to new technologies and standards. Participate in all aspects of the Big Data solution delivery life cycle, including analysis, design, development, testing, production deployment, and support Responsible for the evaluation of technical risks and mapping out mitigation strategies Working knowledge of at least one cloud platform: AWS, Azure or GCP Excellent business communication, consulting, and quality process skills Excellence in leading solution architecture, design, build and execution for leading clients in the Banking, Wealth and Asset Management, or Insurance domains. Minimum 7 years hands-on experience in one or more of the above areas. Minimum 10 years industry experience Ideally, you’ll also have Strong project management skills Client management skills Solutioning skills What We Look For People with technical experience and enthusiasm to learn new things in this fast-moving environment What Working At EY Offers At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work with inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer: Support, coaching and feedback from some of the most engaging colleagues around Opportunities to develop new skills and progress your career The freedom and flexibility to handle your role in a way that’s right for you EY | Building a better working world EY is building a better working world by creating new value for clients, people, society and the planet, while building trust in capital markets. Enabled by data, AI and advanced technology, EY teams help clients shape the future with confidence and develop answers for the most pressing issues of today and tomorrow. EY teams work across a full spectrum of services in assurance, consulting, tax, strategy and transactions. Fueled by sector insights, a globally connected, multi-disciplinary network and diverse ecosystem partners, EY teams can provide services in more than 150 countries and territories.
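For the real-time ingestion pattern this role highlights (multiple Spark jobs consuming from multiple Kafka partitions), a minimal Structured Streaming sketch follows; the broker, topic, and window size are placeholders, and the spark-sql-kafka connector is assumed to be available on the cluster.

```python
# A minimal Kafka -> Spark Structured Streaming sketch producing per-partition
# one-minute event counts; broker and topic names are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("kafka-ingest").getOrCreate()

stream = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker1:9092")  # placeholder
          .option("subscribe", "trades")                      # placeholder topic
          .load())

# The Kafka source exposes partition and timestamp columns, so a per-partition
# tumbling-window count is a cheap first health metric for the ingest layer.
counts = (stream
          .withWatermark("timestamp", "2 minutes")
          .groupBy(F.window("timestamp", "1 minute"), "partition")
          .count())

(counts.writeStream
       .outputMode("update")
       .format("console")
       .start()
       .awaitTermination())
```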
Posted 1 day ago
4.0 - 9.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. EY GDS – Data and Analytics (D&A) – Senior – Insurance Data Analyst We are seeking a highly skilled and motivated Data Analyst with experience in ETL services to join our dynamic team. As a Data Analyst, you will be responsible for data requirement gathering, preparing data requirement artefacts, preparing data integration strategies, and ensuring data quality; you will work closely with data engineering teams to ensure seamless data flow across our systems. Key Responsibilities: Expertise in the P&C Insurance domain. Interact with stakeholders and source teams to gather data requirements. Specialized skill in Policy and/or Claims and/or Billing insurance source systems. Thorough understanding of the life cycle of Policy and Claims. Should have a good understanding of the various transactions involved. Prepare data dictionaries and source-to-target mappings and understand the underlying transformation logic Experience in any of the insurance products, including Guidewire and/or Duck Creek Good understanding of insurance data models, including Policy Centre, Claim Centre and Billing Centre Create various data scenarios using the insurance suite for the data team to consume for testing Experience and/or understanding of any Insurance Statutory or Regulatory reports is an add-on Discover, design, and develop analytical methods to support novel approaches of data and information processing Perform data profiling manually or using profiling tools Identify critical data elements and PII handling process/mandates Understand the handling process of historic and incremental data loads and generate clear requirements for data integration and processing for the engineering team Perform analysis to assess the quality of the data, determine the meaning of the data, and provide data facts and insights Interface and communicate with the onsite teams directly to understand the requirement and determine the optimum data intake process Responsible for creating the HLD/LLD to enable the data engineering team to work on the build Provide product and design level functional and technical expertise along with best practices Required Skills and Qualifications: BE/BTech/MTech/MCA with 4 - 9 years of industry experience with data analysis, management and related data service offerings Experience in Insurance domains Strong analytical skills Strong SQL experience Good To Have: Experience using Agile methodologies Experience using cloud technologies such as AWS or Azure Other Key capabilities: Client facing skills and proven ability in effective planning, executing and problem-solving Excellent communication, interpersonal, and teamworking skills Multi-tasking attitude, flexible with ability to change priorities quickly Methodical approach, logical thinking, and ability to plan work and meet deadlines Accuracy and attention to detail Written and verbal communication skills Ability to plan resource requirements from high level specifications EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets.
Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 1 day ago
3.0 - 7.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. EY GDS – Data and Analytics (D&A) – Senior – Senior Data Scientist Role Overview: We are seeking a highly skilled and experienced Senior Data Scientist with 3-7 years of experience in Data Science and Machine Learning, preferably with experience in NLP, Generative AI, LLMs, MLOps, Optimization techniques, and AI solution Architecture. In this role, you will play a key role in the development and implementation of AI solutions, leveraging your technical expertise. The ideal candidate should have a deep understanding of AI technologies and experience in designing and implementing cutting-edge AI models and systems. Additionally, expertise in data engineering, DevOps, and MLOps practices will be valuable in this role. Responsibilities: Your technical responsibilities: Contribute to the design and implementation of state-of-the-art AI solutions. Assist in the development and implementation of AI models and systems, leveraging techniques such as Large Language Models (LLMs) and generative AI. Collaborate with stakeholders to identify business opportunities and define AI project goals. Stay updated with the latest advancements in generative AI techniques, such as LLMs, and evaluate their potential applications in solving enterprise challenges. Utilize generative AI techniques, such as LLMs, to develop innovative solutions for enterprise industry use cases. Integrate with relevant APIs and libraries, such as Azure OpenAI GPT models and Hugging Face Transformers, to leverage pre-trained models and enhance generative AI capabilities. Implement and optimize end-to-end pipelines for generative AI projects, ensuring seamless data processing and model deployment. Utilize vector databases, such as Redis, and NoSQL databases to efficiently handle large-scale generative AI datasets and outputs. Implement similarity search algorithms and techniques to enable efficient and accurate retrieval of relevant information from generative AI outputs. Collaborate with domain experts, stakeholders, and clients to understand specific business requirements and tailor generative AI solutions accordingly. Conduct research and evaluation of advanced AI techniques, including transfer learning, domain adaptation, and model compression, to enhance performance and efficiency. Establish evaluation metrics and methodologies to assess the quality, coherence, and relevance of generative AI outputs for enterprise industry use cases. Ensure compliance with data privacy, security, and ethical considerations in AI applications. Leverage data engineering skills to curate, clean, and preprocess large-scale datasets for generative AI applications. Requirements: Bachelor's or Master's degree in Computer Science, Engineering, or a related field. A Ph.D. is a plus. 3-7 years of experience in Data Science and Machine Learning. In-depth knowledge of machine learning, deep learning, and generative AI techniques. Proficiency in programming languages such as Python, R, and frameworks like TensorFlow or PyTorch. Strong understanding of NLP techniques and frameworks such as BERT, GPT, or Transformer models.
Familiarity with computer vision techniques for image recognition, object detection, or image generation. Experience with cloud platforms such as Azure, AWS, or GCP and deploying AI solutions in a cloud environment. Expertise in data engineering, including data curation, cleaning, and preprocessing. Knowledge of trusted AI practices, ensuring fairness, transparency, and accountability in AI models and systems. Strong collaboration with software engineering and operations teams to ensure seamless integration and deployment of AI models. Excellent problem-solving and analytical skills, with the ability to translate business requirements into technical solutions. Strong communication and interpersonal skills, with the ability to collaborate effectively with stakeholders at various levels. Understanding of data privacy, security, and ethical considerations in AI applications. Track record of driving innovation and staying updated with the latest AI research and advancements. Good to Have Skills: Apply trusted AI practices to ensure fairness, transparency, and accountability in AI models and systems. Utilize optimization tools and techniques, including MIP (Mixed Integer Programming). Drive DevOps and MLOps practices, covering continuous integration, deployment, and monitoring of AI models. Implement CI/CD pipelines for streamlined model deployment and scaling processes. Utilize tools such as Docker, Kubernetes, and Git to build and manage AI pipelines. Apply infrastructure as code (IaC) principles, employing tools like Terraform or CloudFormation. Implement monitoring and logging tools to ensure AI model performance and reliability. Collaborate seamlessly with software engineering and operations teams for efficient AI model integration and deployment. Familiarity with DevOps and MLOps practices, including continuous integration, deployment, and monitoring of AI models. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 1 day ago