3.0 years
5 - 7 Lacs
Delhi, India
On-site
Urgent Hiring for a Reputed Hospital in Oman
Location: Oman
Industry: Healthcare / Hospital
Employment Type: Full-Time | Overseas Opportunity

A reputed and well-established hospital in Oman is inviting applications from qualified and experienced healthcare professionals for immediate hiring. This is an excellent opportunity for nursing professionals seeking overseas placement with career growth, competitive salary, and attractive benefits.

Nursing Assistants (M)
Qualification: GNM or B.Sc. Nursing
Experience: Minimum 3 years
Mandatory: Positive Dataflow Report
Salary: Up to OMR 300
Job Role: Assist registered nurses in delivering basic clinical care, maintaining hygiene, assisting in mobility, and monitoring patient vitals under supervision.

Healthcare Assistant (M)
Qualification: GNM or B.Sc. Nursing
Experience: Minimum 3 years
Mandatory: Positive Dataflow Report
Salary: Up to OMR 275
Job Role: Provide support in daily patient care, record observations, and ensure hygiene and comfort of patients in the assigned wards or units.

Nursing Assistants (F)
Qualification: ANM (2-year course) from a State Council recognized institute
Experience: Minimum 3 years
Mandatory: Positive Dataflow Report
Salary: Up to OMR 250
Job Role: Assist nursing staff with patient care activities, hygiene support, and basic monitoring under supervision in clinical and non-clinical settings.

Healthcare Assistant (F)
Qualification: ANM (1-year course) from a State Council recognized institute
Experience: Minimum 3 years
Mandatory: Positive Dataflow Report
Salary: Up to OMR 230
Job Role: Help patients with mobility, maintain cleanliness, support nurses in non-clinical care, and ensure a safe and comforting environment for patients.

Employee Benefits
Free Joining Ticket (reimbursed after successful completion of 3-month probation)
30 Days Paid Annual Leave (after completion of 1 year of service)
Yearly Round-Trip Air Ticket
Medical Insurance
Life Insurance
Accommodation Provided (chargeable up to OMR 20/month)

Additional Requirements
Age preferably below 38 years
Must hold a Positive Dataflow Report
Excellent interpersonal and patient-handling skills
Willingness to relocate and work flexible shifts in Oman
All documents must be ready for licensing and visa processing

How To Apply
Interested candidates are requested to send the following documents:
Updated CV
Passport Copy
Positive Dataflow Report
Shortlisted candidates will be contacted for further interview and documentation.

Skills: vital signs monitoring, clinical support, mobility assistance, patient-handling skills, healthcare, hygiene maintenance, interpersonal skills, patient care, assistants, nurses
Posted 3 weeks ago
0 years
0 Lacs
Gurgaon, Haryana, India
On-site
Job Description We are looking for a highly skilled Engineer with solid experience building Big Data, GCP cloud-based real-time data pipelines and REST APIs with Java frameworks. The Engineer will play a crucial role in designing, implementing, and optimizing data solutions to support our organization's data-driven initiatives. This role requires expertise in data engineering, strong problem-solving abilities, and a collaborative mindset to work effectively with various stakeholders. This role will be focused on the delivery of innovative solutions to satisfy the needs of our business. As an agile team we work closely with our business partners to understand what they require, and we strive to continuously improve as a team. Technical Skills Core Data Engineering Skills: Proficiency in using GCP's big data tools, including BigQuery for data warehousing and SQL analytics; Dataproc for running Spark and Hadoop clusters; GCP Dataflow for stream and batch data processing (high-level understanding); and GCP Pub/Sub for real-time messaging and event ingestion (high-level understanding). Expertise in building automated, scalable, and reliable pipelines using custom Python/Scala solutions or Cloud Data Functions. Programming and Scripting: Strong coding skills in SQL and Java. Familiarity with APIs and SDKs for GCP services to build custom data solutions. Cloud Infrastructure: Understanding of GCP services such as Cloud Storage, Compute Engine, and Cloud Functions. Familiarity with Kubernetes (GKE) and containerization for deploying data pipelines (optional but good to have). DevOps and CI/CD: Experience setting up CI/CD pipelines using Cloud Build, GitHub Actions, or other tools. Monitoring and logging tools such as Cloud Monitoring and Cloud Logging for production workflows. Backend Development (Spring Boot & Java): Design and develop RESTful APIs and microservices using Spring Boot. Implement business logic, security, authentication (JWT/OAuth), and database operations. Work with relational databases (MySQL, PostgreSQL, MongoDB, Cloud SQL). Optimize backend performance, scalability, and maintainability. Implement unit testing and integration testing. Skills: Big Data, ETL, Data Warehousing, GCP, Java, REST API, CI/CD, Kubernetes
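For illustration only, a minimal sketch of the kind of GCP real-time pipeline this posting describes, using the Apache Beam Python SDK; the project, topic, and table names are placeholders, not details taken from the role:

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_event(message: bytes) -> dict:
    """Decode a JSON Pub/Sub payload into a BigQuery-ready row."""
    return json.loads(message.decode("utf-8"))


def run() -> None:
    # streaming=True; on Dataflow, also pass --runner=DataflowRunner and project/region flags.
    options = PipelineOptions(streaming=True)
    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(topic="projects/my-project/topics/events")
            | "ParseJson" >> beam.Map(parse_event)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                table="my-project:analytics.events",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )


if __name__ == "__main__":
    run()
```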
Posted 3 weeks ago
5.0 years
0 Lacs
Gurgaon, Haryana, India
Remote
Job Title: Senior Data Engineer – Multi-Cloud (AWS, Azure, GCP) Location: Gurgaon, Haryana (Hybrid/Remote options available) Experience: 5+ years Employment Type: Full-time About The Role We are seeking a highly skilled and motivated Senior Data Engineer with hands-on experience across AWS, Azure, and GCP data ecosystems. The ideal candidate will be responsible for designing, building, and maintaining scalable data pipelines and architectures that support advanced analytics and real-time data processing. Key Responsibilities Technical Responsibilities Data Pipeline Development : Design and implement robust ETL/ELT pipelines using cloud-native tools. Cloud Expertise : AWS : EMR, Kinesis, Redshift, Glue Azure : HDInsight, Synapse Analytics, Stream Analytics GCP : Cloud Dataproc, Dataflow, Composer Data Modeling : Develop and optimize data models for analytics and reporting. Data Governance : Ensure data quality, security, and compliance across platforms. Automation & Orchestration : Use tools like Apache Airflow, AWS Step Functions, and GCP Composer for workflow orchestration. Monitoring & Optimization : Implement monitoring, logging, and performance tuning for data pipelines. Collaboration & Communication Work closely with data scientists, analysts, and business stakeholders to understand data needs. Translate business requirements into scalable technical solutions. Participate in code reviews, architecture discussions, and agile ceremonies. Required Qualifications Technical Skills Strong programming skills in Python, SQL, and optionally Scala or Java. Deep understanding of distributed computing, data warehousing, and stream processing. Experience with data lake architectures, data mesh, and real-time analytics. Proficiency in CI/CD practices and infrastructure as code (e.g., Terraform, CloudFormation). Certifications (Preferred) AWS Certified Data Analytics – Specialty Microsoft Certified: Azure Data Engineer Associate Google Professional Data Engineer Soft Skills & Attributes Analytical Thinking : Ability to break down complex problems and design scalable solutions. Communication : Strong verbal and written communication skills to explain technical concepts to non-technical stakeholders. Collaboration : Team player with a proactive attitude and the ability to work in cross-functional teams. Adaptability : Comfortable working in a fast-paced, evolving environment with shifting priorities. Ownership : High sense of accountability and a drive to deliver high-quality solutions.
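As a rough sketch of the workflow orchestration this role mentions (Airflow/Composer), here is a minimal DAG that runs a daily BigQuery ELT step; the DAG id, table names, and query are illustrative assumptions, not from the posting:

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_sales_elt",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Aggregate raw orders into a reporting table inside BigQuery.
    load_sales_summary = BigQueryInsertJobOperator(
        task_id="load_sales_summary",
        configuration={
            "query": {
                "query": (
                    "SELECT order_date, SUM(amount) AS revenue "
                    "FROM `my-project.raw.orders` GROUP BY order_date"
                ),
                "destinationTable": {
                    "projectId": "my-project",
                    "datasetId": "analytics",
                    "tableId": "sales_summary",
                },
                "writeDisposition": "WRITE_TRUNCATE",
                "useLegacySql": False,
            }
        },
    )
```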
Posted 3 weeks ago
4.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Major skillset: GCP, PySpark, SQL, Python, Cloud Architecture, ETL, Automation. 4+ years of experience in Data Engineering and Data Management, with a strong focus on Spark for building production-ready data pipelines. Experienced in analyzing large data sets from multiple data sources and building automated testing and validations. Knowledge of the Hadoop ecosystem and components like HDFS, Spark, Hive, Sqoop. Strong Python experience. Hands-on SQL and HQL to write optimized queries. Strong hands-on experience with GCP BigQuery, Dataproc, Airflow DAGs, Dataflow, GCS, Pub/Sub, Secret Manager, Cloud Functions, and Apache Beam. Ability to work in a fast-paced, collaborative environment and work with various stakeholders to define strategic optimization initiatives. Deep understanding of distributed computing, memory tuning, and Spark optimization. Familiar with CI/CD workflows and Git. Experience in designing modular, automated, and secure ETL frameworks.
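To illustrate the Spark-on-GCP work this posting centres on, a minimal PySpark sketch that reads from BigQuery and writes a curated table back; it assumes the spark-bigquery connector is available on the Dataproc cluster, and all table and bucket names are placeholders:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-curation").getOrCreate()

# Read a raw table through the spark-bigquery connector.
orders = (
    spark.read.format("bigquery")
    .option("table", "my-project.raw.orders")
    .load()
)

# Simple curation step: daily revenue for completed orders.
daily_revenue = (
    orders.filter(F.col("status") == "COMPLETE")
    .groupBy("order_date")
    .agg(F.sum("amount").alias("revenue"))
)

# Write back to BigQuery, staging through a GCS bucket.
(
    daily_revenue.write.format("bigquery")
    .option("table", "my-project.analytics.daily_revenue")
    .option("temporaryGcsBucket", "my-staging-bucket")
    .mode("overwrite")
    .save()
)
```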
Posted 3 weeks ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
We are seeking a skilled Data Engineer with 5+ years of experience to design, build, and maintain scalable data pipelines and perform advanced data analysis to support business intelligence and data-driven decision-making. The ideal candidate will have a strong foundation in computer science principles, extensive experience with SQL and big data tools, and proficiency in cloud platforms and data visualization tools. Key Responsibilities: Design, develop, and maintain robust, scalable ETL pipelines using Apache Airflow, DBT, Composer (GCP), Control-M, Cron, Luigi, and similar tools. Build and optimize data architectures including data lakes and data warehouses. Integrate data from multiple sources ensuring data quality and consistency. Collaborate with data scientists, analysts, and stakeholders to translate business requirements into technical solutions. Analyze complex datasets to identify trends, generate actionable insights, and support decision-making. Develop and maintain dashboards and reports using Tableau, Power BI, and Jupyter Notebooks for visualization and pipeline validation. Manage and optimize relational and NoSQL databases such as MySQL, PostgreSQL, Oracle, MongoDB, and DynamoDB. Work with big data tools and frameworks including Hadoop, Spark, Hive, Kafka, Informatica, Talend, SSIS, and Dataflow. Utilize cloud data services and warehouses like AWS Glue, GCP Dataflow, Azure Data Factory, Snowflake, Redshift, and BigQuery. Support CI/CD pipelines and DevOps workflows using Git, Docker, Terraform, and related tools. Ensure data governance, security, and compliance standards are met. Participate in Agile and DevOps processes to enhance data engineering workflows. Required Qualifications: 5+ years of professional experience in data engineering and data analysis roles. Strong proficiency in SQL and experience with database management systems such as MySQL, PostgreSQL, Oracle, and MongoDB. Hands-on experience with big data tools like Hadoop and Apache Spark. Proficient in Python programming. Experience with data visualization tools such as Tableau, Power BI, and Jupyter Notebooks. Proven ability to design, build, and maintain scalable ETL pipelines using tools like Apache Airflow, DBT, Composer (GCP), Control-M, Cron, and Luigi. Familiarity with data engineering tools including Hive, Kafka, Informatica, Talend, SSIS, and Dataflow. Experience working with cloud data warehouses and services (Snowflake, Redshift, BigQuery, AWS Glue, GCP Dataflow, Azure Data Factory). Understanding of data modeling concepts and data lake/data warehouse architectures. Experience supporting CI/CD practices with Git, Docker, Terraform, and DevOps workflows. Knowledge of both relational and NoSQL databases, including PostgreSQL, BigQuery, MongoDB, and DynamoDB. Exposure to Agile and DevOps methodologies. Experience with Amazon Web Services (S3, Glue, Redshift, Lambda, Athena). Preferred Skills: Strong problem-solving and communication skills. Ability to work independently and collaboratively in a team environment. Experience with service development, REST APIs, and automation testing is a plus. Familiarity with version control systems and workflow automation.
Posted 3 weeks ago
5.0 years
8 - 10 Lacs
Thiruvananthapuram
On-site
5 - 7 Years | 1 Opening | Trivandrum Role description We are recruiting Data Engineers with strong technical ability who can articulate well to a non-technical audience, and who will join our team on a permanent basis. Role: The Data Engineer will engage with external clients and internal customers, understand their needs, and design, build, and maintain data pipelines and infrastructure using Google Cloud Platform (GCP). This will involve the design and implementation of scalable data architectures, ETL processes, and data warehousing solutions on GCP. The role requires expertise in big data technologies, cloud computing, and data integration, as well as the ability to optimize data systems for performance and reliability. This requires a blend of skills including programming, database management, cloud infrastructure, and data pipeline development. Additionally, problem-solving skills, attention to detail, and the ability to work in a fast-paced environment are valuable traits. You will frequently work as part of a scrum team, together with data scientists, ML engineers, and analyst developers, to design and implement robust data infrastructure that supports analytics and machine learning initiatives. Responsibilities: Design, build, and maintain scalable data pipelines and ETL processes using GCP services such as Cloud Dataflow, Cloud Dataproc, and BigQuery. Implement and optimize data storage solutions using GCP technologies like Cloud Storage, Cloud SQL, and Cloud Spanner. Develop and maintain data warehouses and data lakes on GCP, ensuring data quality, accessibility, and security. Collaborate with data scientists and analysts to understand data requirements and provide efficient data access solutions. Implement data governance and security measures to ensure compliance with regulations and best practices. Automate data workflows and implement monitoring and alerting systems for data pipelines. Share data engineering knowledge with the wider functions and develop reusable data integration patterns and best practices. Skills/Experience: BSc/MSc in Computer Science, Information Systems, or related field, or equivalent work experience. Proven experience (6+ years) as a Data Engineer or similar role, preferably with GCP expertise. Strong proficiency in SQL and experience with NoSQL databases. Expertise in data modeling, ETL processes, and data warehousing concepts. Significant experience with GCP services such as BigQuery, Dataflow, Dataproc, Cloud Storage, and Pub/Sub. Proficiency in at least one programming language (e.g., Python, Java, or Scala) for data pipeline development. Experience with big data technologies such as Hadoop, Spark, and Kafka. Knowledge of data governance, security, and compliance best practices. GCP certifications (e.g., Professional Data Engineer) are highly advantageous. Effective communication skills to collaborate with cross-functional teams and explain technical concepts to non-technical stakeholders. Skills: BigQuery, ETL, Data Management, Python About UST UST is a global digital transformation solutions provider. For more than 20 years, UST has worked side by side with the world’s best companies to make a real impact through transformation. Powered by technology, inspired by people and led by purpose, UST partners with their clients from design to operation. With deep domain expertise and a future-proof philosophy, UST embeds innovation and agility into their clients’ organizations.
With over 30,000 employees in 30 countries, UST builds for boundless impact—touching billions of lives in the process.
Posted 3 weeks ago
6.0 years
0 Lacs
Hyderābād
Remote
Core Technology: Machine Learning Level: 6+ Years Primary Skills: Google Vertex AI, Python Secondary Skills: ML Models, GCP Open Positions: 2 Job Location: Hyderabad Work Mode: Remote Deployment Type: Full Time Job Description Primary Skills Required: Strong understanding of MLOps practices Hands-on experience in deploying and productionizing ML models Proficient in Python Experience with Google Vertex AI Solid knowledge of machine learning algorithms such as XGBoost, classification models, and BigQuery ML (BQML) Key Responsibilities: Design, build, and maintain ML infrastructure on GCP using tools such as Vertex AI, GKE, Dataflow, BigQuery, and Cloud Functions. Develop and automate ML pipelines for model training, validation, deployment, and monitoring using tools like Kubeflow Pipelines, TFX, or Vertex AI Pipelines. Work with Data Scientists to productionize ML models and support experimentation workflows. Implement model monitoring and alerting for drift, performance degradation, and data quality issues. Manage and scale containerized ML workloads using Kubernetes (GKE) and Docker. Set up CI/CD workflows for ML using tools like Cloud Build, Bitbucket, Jenkins, or similar. Ensure proper security, versioning, and compliance across the ML lifecycle. Maintain documentation, artifacts, and reusable templates for reproducibility and auditability. Having a GCP MLE certification is a plus. Job Types: Full-time, Permanent, Fresher Schedule: Day shift Morning shift Application Question(s): Do you have hands-on experience in deploying and productionizing ML models? Are you proficient in Python? Do you have experience with Google Vertex AI? Do you have solid knowledge of machine learning algorithms such as XGBoost, classification models, and BigQuery ML (BQML)? Work Location: In person
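As a hedged illustration of the BigQuery ML (BQML) skill the posting asks about, a minimal sketch that trains a boosted-tree (XGBoost-based) classifier; the dataset, table, and column names are assumptions, not from the role:

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-project")

create_model_sql = """
CREATE OR REPLACE MODEL `my-project.ml.churn_classifier`
OPTIONS (
  model_type = 'BOOSTED_TREE_CLASSIFIER',  -- XGBoost-based classifier in BQML
  input_label_cols = ['churned']
) AS
SELECT tenure_months, monthly_spend, support_tickets, churned
FROM `my-project.analytics.customer_features`
"""

client.query(create_model_sql).result()  # blocks until training finishes
print("Model trained: my-project.ml.churn_classifier")
```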
Posted 3 weeks ago
5.0 years
0 Lacs
Delhi, India
Remote
About Apply Digital Apply Digital is a global experience transformation partner. We drive AI-powered change and measurable impact across complex, multi-brand ecosystems. Leveraging expertise that spans across the customer experience lifecycle from strategy, design to engineering and beyond, we enable our clients to modernize their organizations and maximize value for their business and customers. Our 750+ team members have helped transform global companies like Kraft Heinz, NFL, Moderna, Lululemon, Dropbox, Atlassian, A+E Networks, and The Very Group. Apply Digital was founded in 2016 in Vancouver, Canada. In the past nine years, we have grown to ten cities across North America, South America, the UK, Europe, and India. At Apply Digital, we believe in the “ One Team ” approach, where we operate within a ‘pod’ structure. Each pod brings together senior leadership, subject matter experts, and cross-functional skill sets, all working within a common tech and delivery framework. This structure is underpinned by well-oiled scrum and sprint cadences, keeping teams in step to release often and retrospectives to ensure we progress toward the desired outcomes. Wherever we work in the world, we envision Apply Digital as a safe, empowered, respectful and fun community for people, every single day. Together, we work to embody our SHAPE (smart, humble, active, positive, and excellent) values and make Apply Digital a space for our team to connect, grow, and support each other to make a difference. Visit our Careers page to learn how we can unlock your potential. LOCATION: Apply Digital is a hybrid friendly organization with remote options available if needed. The preferred candidate should be based in (or within a location commutable to) the Delhi/NCR region of India , working in hours that have an overlap with the Eastern Standard Timezone (EST). About The Client In your initial role, you will support Kraft Heinz, a global, multi-billion-dollar leader in consumer packaged foods and a valued client of ours for the past three years. Apply Digital has a bold and comprehensive mandate to drive Kraft Heinz’s digital transformation . Through implementable strategies, cutting-edge technology, and data-driven innovation we aim to enhance consumer engagement and maximize business value for Kraft Heinz. Our composable architecture, modern engineering practices, and deep expertise in AI, cloud computing, and customer data solutions have enabled game-changing digital experiences. Our cross-functional team has delivered significant milestones, including the launch of the What's Cooking App, the re-building of 120+ brand sites in over 20 languages, and most recently, the implementation of a robust Customer Data Platform (CDP) designed to drive media effectiveness. Our work has also been recognized internationally and has received multiple awards . While your work will start with supporting Kraft Heinz, you will also have future opportunities to collaborate with the global team on other international brands. THE ROLE: Are you passionate about building scalable data pipelines and optimizing data architectures? Do you thrive in a fast-paced environment where data-driven decision-making and real-time analytics are essential? Are you excited to collaborate with cross-functional teams to design and implement modern cloud-based data solutions? If so, you may be ready to take on the Senior Data Engineer role within our team. 
As a Senior Data Engineer, you will play a key role in designing, building, and maintaining cloud-native data pipelines and architectures to support our Composable digital platforms. You will collaborate with engineers, product teams, and analytics stakeholders to develop scalable, secure, and high-performance data solutions that power real-time analytics, reporting, and machine learning workloads. This role requires deep expertise in data engineering, cloud technologies (Google Cloud Platform - BigQuery and Looker preferred), SQL, Python, and data pipeline orchestration tools (Dagster and DBT). WHAT YOU'LL DO: Design, build, and maintain scalable data pipelines and architectures to support analytical and operational workloads. Oversee the end-to-end management of the CDP platform, ensuring seamless integration across websites, mobile apps, CRM, adtech, and analytics systems to maintain data integrity and maximize activation potential. Collaborate with marketing, product, and data teams to enable real-time data activation and personalized customer experiences using unified CDP profiles. Build and maintain robust event instrumentation frameworks across digital properties to ensure accurate and consistent data capture for CDP ingestion and downstream use. Develop and optimize ETL/ELT pipelines, ensuring efficient data extraction, transformation, and loading from various sources. Work closely with backend and platform engineers to integrate data pipelines into cloud-native applications. Manage and optimize cloud data warehouses, primarily BigQuery, ensuring performance, scalability, and cost efficiency. Implement data governance, security, and privacy best practices, ensuring compliance with company policies and regulations. Collaborate with analytics teams to define data models and enable self-service reporting and BI capabilities. Collaborate with analytics teams to define scalable data models, maintain robust documentation (data dictionaries, lineage, metadata), and continuously monitor and optimize pipelines while staying current with evolving data engineering best practices. WHAT WE'RE LOOKING FOR: Strong proficiency in English (written and verbal communication) is required. Experience working with remote teams across North America and Latin America, ensuring smooth collaboration across time zones. 5+ years of experience in data engineering, with expertise in building scalable data pipelines and cloud-native data architectures. Proven hands-on experience implementing and managing CDPs like Twilio Segment (or similar CDPs), including event tracking plans, source/destination configuration, and identity resolution strategies. Deep understanding of MarTech ecosystems and how CDP data integrates with advertising platforms (Meta, Google Ads), CRM tools, and experimentation platforms for personalization and performance measurement. Strong proficiency in SQL for data modeling, transformation, and performance optimization. Experience with BI and data visualization tools (e.g., Looker, Tableau, or Google Data Studio). Expertise in Python for data processing, automation, and pipeline development. Extensive experience with cloud data platforms, especially Google Cloud (BigQuery, Cloud Storage, Pub/Sub), including hands-on implementation of ETL/ELT workflows using tools like DBT, Dataflow, or Apache Beam, and orchestration with Airflow, Dagster, or Cloud Workflows.
Understanding of data privacy, security, and compliance best practices. Strong problem-solving skills, with the ability to debug and optimize complex data workflows. Excellent communication and collaboration skills. NICE TO HAVES: Experience with real-time data streaming solutions (e.g., Kafka, Pub/Sub, or Kinesis). Familiarity with machine learning workflows and MLOps best practices. Knowledge of Terraform for Infrastructure as Code (IaC) in data environments. Familiarity with data integrations involving Contentful, Algolia, Segment, and Talon.One. LIFE AT APPLY DIGITAL At Apply Digital, people are at the core of everything we do. We value your time, safety, and health, and strive to build a work community that can help you thrive and grow. Here are a few benefits we offer to support you: Location: Apply Digital is a hybrid-friendly organization with remote options available if needed. The preferred candidate should be based in (or within a location commutable to) Delhi/NCR, with the ability to overlap with US/NA time zones when required. Comprehensive Benefits: benefit from private healthcare coverage, contributions to your Provident fund, and a gratuity bonus after five years of service. Vacation policy: work-life balance is key to our team’s success, so we offer flexible personal time off (PTO), allowing ample time away from work to promote overall well-being. Great projects: broaden your skills on a range of engaging projects with international brands that have a global impact. An inclusive and safe environment: we’re truly committed to building a culture where you are celebrated and everyone feels welcome and safe. Learning opportunities: we offer generous training budgets, including partner tech certifications, custom learning plans, workshops, mentorship, and peer support. Apply Digital is committed to building a culture where differences are celebrated, and everyone feels welcome. That’s why we value equal opportunity and nurture an inclusive workplace where our individual differences are recognized and valued. For more information, visit our website’s Diversity, Equity, and Inclusion (DEI) page. If you have special needs or accommodations at this stage of the recruitment process, please inform us as soon as possible by emailing us at careers@applydigital.com.
Posted 3 weeks ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Job Description – Business Consultant P&C (Property & Casualty - Personal and Commercial Insurance) The candidate should have experience working in Property & Casualty lines (both Personal and Commercial Insurance) and should be familiar with any one or more functional processes – PC, BC, CC (preferred Guidewire/Duckcreek). Lines of Business (Personal and Commercial Lines): Must have – Property, Auto, General Liability. Good to have – Casualty Lines (Professional Liability, Directors & Officers, Errors & Omissions, EPL, etc.), Inland Marine, Cargo, Workers Compensation, Umbrella, Excess Liability. Roles and Responsibilities: Worked on multiple business transformation, upgrade and modernization programs. Requirements gathering and elicitation – writing BRDs, FSDs. Conducting JAD sessions and workshops to capture requirements and working closely with the Product Owner. Work with the client to define the most optimal future-state operational process and related product configuration. Define scope by providing innovative solutions and challenging all new client requirements and change requests while simultaneously ensuring that the client gets the required business value. Elaborate and deliver clearly defined requirement documents with relevant dataflow and process flow diagrams. Work closely with the product design and development team to analyse and extract functional enhancements. Provide product consultancy and assist the client with acceptance criteria gathering and support throughout the project life cycle. Product Experience/Other Skills: Product knowledge – Guidewire, Duckcreek, Exigent, Majesco (preferred Guidewire/Duckcreek). Strong skills in stakeholder management and communication. Should have end-to-end process knowledge in the P&C insurance domain. Should be ready to work in flexible shifts (a good amount of overlap with US/UK hours). Good organizational and time management skills required. Should have good written and verbal communication skills in English. Industry certifications AINS 21 - Property and Liability Insurance Principles, AINS 22 - Personal Insurance, AINS 23 - Commercial Insurance and AINS 24 - General Insurance for IT and Support Professionals will be an added advantage. Additional experience in Life or other insurance domains is an added advantage. We expect you to work effectively as a team member and build good relationships with the client. You will have the opportunity to expand your domain knowledge and skills and will be able to collaborate frequently with other EY professionals with a wide variety of expertise. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 3 weeks ago
8.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
🚀 We're Hiring: ETL Developer – GCP Dataflow | 8+ Years Experience 📍 Location: Chennai, India 🕒 Experience: 8+ Years Are you passionate about data engineering and cloud-based ETL solutions? We’re looking for an experienced ETL Developer with expertise in Google Cloud Platform (GCP) – especially Dataflow, BigQuery, and Workflow – to join our growing team in Chennai. 🔍 Key Responsibilities: Design, develop, and support ETL workflows and modules using GCP Dataflow, BigQuery, and Workflow. Participate in and support DevSecOps activities. Execute unit tests, and contribute to code reviews and peer inspections. Perform impact analysis on existing systems for new developments or enhancements. Understand business requirements and deliver scalable, maintainable ETL solutions. Collaborate with cross-functional teams to ensure high-quality delivery. ✅ Requirements: 8+ years of experience in ETL development. Proficiency with GCP Dataflow, BigQuery, and other GCP data services. Strong understanding of data integration, pipelines, and cloud-native architectures. Experience in Agile environments and ability to work in fast-paced development cycles. Strong problem-solving skills and ability to work independently and within a team. 📩 Apply Now or reach out directly Apply now: DM me or send your resume to samdarshi.singh@mwidm.com More info: +91 62392 61536 #ETLDeveloper #ChennaiJobs #GCP #Dataflow #BigQuery #CloudJobs #DataEngineering #NowHiring #TechJobs
Posted 3 weeks ago
0 years
0 Lacs
Gurugram, Haryana, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Job Description – Business Consultant P&C (Property & Casualty - Personal and Commercial Insurance) The candidate should have experience working in Property & Casualty lines (both Personal and Commercial Insurance) and should be familiar with any one or more functional processes – PC, BC, CC (preferred Guidewire/Duckcreek). Lines of Business (Personal and Commercial Lines): Must have – Property, Auto, General Liability. Good to have – Casualty Lines (Professional Liability, Directors & Officers, Errors & Omissions, EPL, etc.), Inland Marine, Cargo, Workers Compensation, Umbrella, Excess Liability. Roles and Responsibilities: Worked on multiple business transformation, upgrade and modernization programs. Requirements gathering and elicitation – writing BRDs, FSDs. Conducting JAD sessions and workshops to capture requirements and working closely with the Product Owner. Work with the client to define the most optimal future-state operational process and related product configuration. Define scope by providing innovative solutions and challenging all new client requirements and change requests while simultaneously ensuring that the client gets the required business value. Elaborate and deliver clearly defined requirement documents with relevant dataflow and process flow diagrams. Work closely with the product design and development team to analyse and extract functional enhancements. Provide product consultancy and assist the client with acceptance criteria gathering and support throughout the project life cycle. Product Experience/Other Skills: Product knowledge – Guidewire, Duckcreek, Exigent, Majesco (preferred Guidewire/Duckcreek). Strong skills in stakeholder management and communication. Should have end-to-end process knowledge in the P&C insurance domain. Should be ready to work in flexible shifts (a good amount of overlap with US/UK hours). Good organizational and time management skills required. Should have good written and verbal communication skills in English. Industry certifications AINS 21 - Property and Liability Insurance Principles, AINS 22 - Personal Insurance, AINS 23 - Commercial Insurance and AINS 24 - General Insurance for IT and Support Professionals will be an added advantage. Additional experience in Life or other insurance domains is an added advantage. We expect you to work effectively as a team member and build good relationships with the client. You will have the opportunity to expand your domain knowledge and skills and will be able to collaborate frequently with other EY professionals with a wide variety of expertise. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 3 weeks ago
0 years
0 Lacs
Kolkata, West Bengal, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Job Description – Business Consultant P&C (Property & Casualty - Personal and Commercial Insurance) The candidate should have experience working in Property & Casualty lines (both Personal and Commercial Insurance) and should be familiar with any one or more functional processes – PC, BC, CC (preferred Guidewire/Duckcreek). Lines of Business (Personal and Commercial Lines): Must have – Property, Auto, General Liability. Good to have – Casualty Lines (Professional Liability, Directors & Officers, Errors & Omissions, EPL, etc.), Inland Marine, Cargo, Workers Compensation, Umbrella, Excess Liability. Roles and Responsibilities: Worked on multiple business transformation, upgrade and modernization programs. Requirements gathering and elicitation – writing BRDs, FSDs. Conducting JAD sessions and workshops to capture requirements and working closely with the Product Owner. Work with the client to define the most optimal future-state operational process and related product configuration. Define scope by providing innovative solutions and challenging all new client requirements and change requests while simultaneously ensuring that the client gets the required business value. Elaborate and deliver clearly defined requirement documents with relevant dataflow and process flow diagrams. Work closely with the product design and development team to analyse and extract functional enhancements. Provide product consultancy and assist the client with acceptance criteria gathering and support throughout the project life cycle. Product Experience/Other Skills: Product knowledge – Guidewire, Duckcreek, Exigent, Majesco (preferred Guidewire/Duckcreek). Strong skills in stakeholder management and communication. Should have end-to-end process knowledge in the P&C insurance domain. Should be ready to work in flexible shifts (a good amount of overlap with US/UK hours). Good organizational and time management skills required. Should have good written and verbal communication skills in English. Industry certifications AINS 21 - Property and Liability Insurance Principles, AINS 22 - Personal Insurance, AINS 23 - Commercial Insurance and AINS 24 - General Insurance for IT and Support Professionals will be an added advantage. Additional experience in Life or other insurance domains is an added advantage. We expect you to work effectively as a team member and build good relationships with the client. You will have the opportunity to expand your domain knowledge and skills and will be able to collaborate frequently with other EY professionals with a wide variety of expertise. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 3 weeks ago
5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Equifax is seeking creative, high-energy and driven software engineers with hands-on development skills to work on a variety of meaningful projects. Our software engineering positions provide you the opportunity to join a team of talented engineers working with leading-edge technology. You are ideal for this position if you are a forward-thinking, committed, and enthusiastic software engineer who is passionate about technology. What You’ll Do Design, develop, and operate high scale applications across the full engineering stack Design, develop, test, deploy, maintain, and improve software. Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.) Work across teams to integrate our systems with existing internal systems, Data Fabric, CSA Toolset. Participate in technology roadmap and architecture discussions to turn business requirements and vision into reality. Participate in a tight-knit, globally distributed engineering team. Triage product or system issues and debug/track/resolve by analyzing the sources of issues and the impact on network or service operations and quality. Manage sole project priorities, deadlines, and deliverables. Research, create, and develop software applications to extend and improve on Equifax solutions. Collaborate on scalability issues involving access to data and information. Actively participate in Sprint planning, Sprint Retrospectives, and other team activities What Experience You Need Bachelor's degree or equivalent experience 5+ years of software engineering experience 5+ years experience writing, debugging, and troubleshooting code in mainstream Java, SpringBoot, TypeScript/JavaScript, HTML, CSS 5+ years experience with Cloud technology: GCP, AWS, or Azure 5+ years experience designing and developing cloud-native solutions 5+ years experience designing and developing microservices using Java, SpringBoot, GCP SDKs, GKE/Kubernetes 5+ years experience deploying and releasing software using Jenkins CI/CD pipelines, understanding infrastructure-as-code concepts, Helm Charts, and Terraform constructs What could set you apart Self-starter that identifies/responds to priority shifts with minimal supervision. Experience designing and developing big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, PubSub, GCS, Composer/Airflow, and others UI development (e.g. HTML, JavaScript, Angular and Bootstrap) Experience with backend technologies such as JAVA/J2EE, SpringBoot, SOA and Microservices Source code control management systems (e.g. SVN/Git, Github) and build tools like Maven & Gradle. Agile environments (e.g. Scrum, XP) Relational databases (e.g. SQL Server, MySQL) Atlassian tooling (e.g. JIRA, Confluence, and Github) Developing with modern JDK (v1.7+) Automated Testing: JUnit, Selenium, LoadRunner, SoapUI
Posted 3 weeks ago
8.0 years
0 Lacs
Greater Hyderabad Area
On-site
Area(s) of responsibility Data Engineer – 8 to 10 years Location: All Birlasoft locations. Job Description Relevant experience in GCP: 3+ years Strong experience in data integration with GCP using REST APIs, SQL (preferably Google BigQuery), Python, and GCP services (Dataform, Workflow, Dataflow, Cloud Storage, Cloud Data Fusion) Familiarity with developing applications on GCP infrastructure Good to have: Integration of Google My Business, Google Ads & Google Analytics with BigQuery to implement solutions for solving client business problems. Design, develop and deploy data warehouse solutions using different tools, design principles, and conventions. Contribute to project teams and work with all stakeholders responsible for all stages of design and development for complex products and platforms, including solution design, analysis, coding, testing, and integration. Produce high-quality code resulting from knowledge of the latest frameworks, code peer review, and automated unit test scripts. Collaborate with, mentor & guide team members for technical support & guidance, perform peer reviews and monitor their development work to meet project requirements. Participate in project design meetings, daily standups and backlog grooming. Develop, document unit tests and execute all processes and procedures in assigned areas of responsibility. Experience in creating Technical Specification and Data Flow documents To be able to clearly articulate the pros and cons of various technologies and platforms Good to have experience working with Cloud Functions, Pub/Sub. Familiarity working with DevOps tools, deployment, and orchestration technologies. Strong problem-solving and troubleshooting skills. Strong communicator. Highly adaptable in quickly changing technical environments with strong organizational and analytical skills.
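For illustration, a minimal sketch of the REST API to BigQuery integration pattern this posting describes; the endpoint URL, table name, and response shape are placeholders, not details from the role:

```python
import requests
from google.cloud import bigquery

client = bigquery.Client(project="my-project")

# Pull records from an upstream REST API (illustrative endpoint).
response = requests.get("https://api.example.com/v1/locations", timeout=30)
response.raise_for_status()
rows = response.json()  # assumed to be a list of JSON objects

# Load the records into a raw BigQuery table.
job_config = bigquery.LoadJobConfig(
    autodetect=True,
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)
load_job = client.load_table_from_json(rows, "my-project.raw.locations", job_config=job_config)
load_job.result()  # wait for the load job to complete
print(f"Loaded {load_job.output_rows} rows")
```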
Posted 3 weeks ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Area(s) of responsibility We’re looking for a GCP Cloud Data & AI Practice Architect to join our team - a practitioner who accelerates outcomes, effects positive change, and moves business forward. Let’s partner. Together, we can accomplish amazing things. Position Overview The GCP Cloud Data & AI Practice Architect has 5+ years of experience developing and architecting GCP solutions and 10+ years in Machine Learning, Advanced Analytics, Data Engineering and Data Governance. This position is responsible for the development of data and AI solutions utilizing Google Cloud Data services. The candidate must be well versed in multiple cloud platforms and be a motivated technologist with a proven track record of delivering results in software/technology consulting. Responsibilities Technical liaising between customers, development & support teams for Data & AI Lead the design, architecture, and implementation of data pipelines and systems on Google Cloud Platform. This includes leveraging services like BigQuery, Dataflow, Cloud Storage, Vertex AI, Gemini, and others for efficient data ingestion, transformation, and storage. Hands-on experience in building proofs of concept (POCs) to actively demonstrate the GCP Data and AI stack across several enterprise customers Architect and implement hands-on distributed ML and high-volume data solutions on Google Cloud Participating in architectural discussions to build confidence and ensure customer success when building new and migrating existing applications, software, and services on the GCP platform Conducting deep-dive “hands-on” education/training sessions to transfer knowledge to customers considering, or already using, GCP, with TEKsystems Global Services as the partner of choice for GCP solution implementation Support Sales and Pre-Sales to convert opportunities to revenue through thought leadership Drive consensus across cross-functional teams to deliver Google Cloud Data & AI engagements Must-have skills: GCP, Gen AI, LLM, BigQuery, Dataflow, Cloud Storage, Vertex AI
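As a loose sketch of the kind of Vertex AI / Gen AI proof of concept this role involves building, a minimal call to a Gemini model through the Vertex AI SDK; the project, region, model name, and prompt are assumptions for illustration:

```python
import vertexai
from vertexai.generative_models import GenerativeModel

# Initialise the Vertex AI SDK against an assumed project and region.
vertexai.init(project="my-project", location="us-central1")

model = GenerativeModel("gemini-1.5-flash")  # model name is illustrative
response = model.generate_content(
    "Summarize last quarter's sales trends in two sentences."
)
print(response.text)
```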
Posted 3 weeks ago
8.0 years
0 Lacs
Delhi, India
On-site
About Position: We are seeking a highly skilled GCP Data and AI Solution Specialist with over 8 years of experience in pre-sales, solution architecture, and development, specializing in Google Cloud Platform (GCP) technologies. The ideal candidate will hold all relevant Google Cloud certifications and possess expertise in AI agent development, data engineering, and delivering end-to-end cloud-based solutions. This role involves collaborating with cross-functional teams, engaging with clients in pre-sales activities, and designing scalable, innovative data and AI solutions to meet business needs. Role: GCP Data & AI Solution Specialist Location: Delhi/Mumbai/Bangalore Experience: 8+ Yrs. Job Type: Full Time Employment What You'll Do: Solution Design & Architecture: Design and implement scalable, secure, and high-performance data and AI solutions on GCP, leveraging services like BigQuery, Dataflow, Vertex AI, and Cloud AI Platform. Pre-Sales Support: Collaborate with sales teams to develop compelling proposals, presentations, and proof-of-concepts (PoCs) for clients, showcasing GCP’s data and AI capabilities. AI Agent Development: Build and deploy intelligent AI agents using GCP’s AI/ML tools, including natural language processing (NLP), computer vision, and generative AI models. Client Engagement: Act as a trusted advisor to clients, understanding their business requirements and translating them into technical solutions. Development & Implementation: Lead end-to-end development of data pipelines, machine learning models, and AI-driven applications, ensuring alignment with best practices and client expectations. Technical Leadership: Mentor junior team members, provide technical guidance, and stay updated with the latest GCP advancements. Cross-Functional Collaboration: Work with product managers, engineers, and stakeholders to ensure seamless project delivery and integration with existing systems. Optimization & Performance: Optimize data workflows, AI models, and cloud infrastructure for cost-efficiency, scalability, and performance. Compliance & Security: Ensure solutions adhere to industry standards, data privacy regulations, and GCP security best practices. Expertise You'll Bring: Experience: Minimum of 8+ years in pre-sales, solution architecture, and development, with at least 5 years focused on GCP data and AI solutions. Certifications: Must hold relevant Google Cloud certifications, including: Google Cloud Professional Data Engineer . Google Cloud Professional Machine Learning Engineer. Google Cloud Professional Cloud Architect Additional certifications (e.g., Professional Cloud Developer, Professional Cloud DevOps Engineer) are a plus. Technical Skills: Expertise in GCP services: BigQuery, Dataflow, Dataproc, Vertex AI, AI Platform, Cloud Storage, and Pub/Sub. Proficiency in AI/ML frameworks (e.g., TensorFlow, PyTorch) and experience building AI agents (e.g., chatbots, recommendation systems). Strong programming skills in Python, SQL, and Java/Scala for data processing and model development. Experience with data pipeline tools (e.g., Apache Beam, Airflow) and ETL processes. Knowledge of CI/CD pipelines, Kubernetes, and containerization for deploying AI solutions. Pre-Sales Experience : Proven track record in client-facing roles, including creating technical proposals, conducting demos, and leading PoCs. Soft Skills: Excellent communication and presentation skills to articulate complex technical concepts to non-technical stakeholders. 
Strong problem-solving skills and ability to work in a fast-paced, dynamic environment. Leadership and mentoring capabilities to guide teams and drive project success. Domain Knowledge: Familiarity with industries such as finance, healthcare, retail, or manufacturing is a plus. Benefits: Competitive salary and benefits package Culture focused on talent development with quarterly promotion cycles and company-sponsored higher education and certifications Opportunity to work with cutting-edge technologies Employee engagement initiatives such as project parties, flexible work hours, and Long Service awards Annual health check-ups Insurance coverage: group term life, personal accident, and Mediclaim hospitalization for self, spouse, two children, and parents Inclusive Environment: Persistent Ltd. is dedicated to fostering diversity and inclusion in the workplace. We invite applications from all qualified individuals, including those with disabilities, and regardless of gender or gender preference. We welcome diverse candidates from all backgrounds. We offer hybrid work options and flexible working hours to accommodate various needs and preferences. Our office is equipped with accessible facilities, including adjustable workstations, ergonomic chairs, and assistive technologies to support employees with physical disabilities. If you are a person with disabilities and have specific requirements, please inform us during the application process or at any time during your employment. We are committed to creating an inclusive environment where all employees can thrive. Our company fosters a values-driven and people-centric work environment that enables our employees to: Accelerate growth, both professionally and personally Impact the world in powerful, positive ways, using the latest technologies Enjoy collaborative innovation, with diversity and work-life wellbeing at the core Unlock global opportunities to work and learn with the industry’s best Let’s unleash your full potential at Persistent “Persistent is an Equal Opportunity Employer and prohibits discrimination and harassment of any kind.”
Posted 3 weeks ago
0 years
0 Lacs
Kanayannur, Kerala, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Job Description – Business Consultant P&C (Property & Casualty - Personal and Commercial Insurance) The candidate should have experience working in Property & Casualty lines (both Personal and Commercial Insurance) and should be familiar with any one or more functional processes – PC, BC, CC (preferred Guidewire/Duckcreek). Lines of Business (Personal and Commercial Lines): Must have – Property, Auto, General Liability. Good to have – Casualty Lines (Professional Liability, Directors & Officers, Errors & Omissions, EPL, etc.), Inland Marine, Cargo, Workers Compensation, Umbrella, Excess Liability. Roles and Responsibilities: Worked on multiple business transformation, upgrade and modernization programs. Requirements gathering and elicitation – writing BRDs, FSDs. Conducting JAD sessions and workshops to capture requirements and working closely with the Product Owner. Work with the client to define the most optimal future-state operational process and related product configuration. Define scope by providing innovative solutions and challenging all new client requirements and change requests while simultaneously ensuring that the client gets the required business value. Elaborate and deliver clearly defined requirement documents with relevant dataflow and process flow diagrams. Work closely with the product design and development team to analyse and extract functional enhancements. Provide product consultancy and assist the client with acceptance criteria gathering and support throughout the project life cycle. Product Experience/Other Skills: Product knowledge – Guidewire, Duckcreek, Exigent, Majesco (preferred Guidewire/Duckcreek). Strong skills in stakeholder management and communication. Should have end-to-end process knowledge in the P&C insurance domain. Should be ready to work in flexible shifts (a good amount of overlap with US/UK hours). Good organizational and time management skills required. Should have good written and verbal communication skills in English. Industry certifications AINS 21 - Property and Liability Insurance Principles, AINS 22 - Personal Insurance, AINS 23 - Commercial Insurance and AINS 24 - General Insurance for IT and Support Professionals will be an added advantage. Additional experience in Life or other insurance domains is an added advantage. We expect you to work effectively as a team member and build good relationships with the client. You will have the opportunity to expand your domain knowledge and skills and will be able to collaborate frequently with other EY professionals with a wide variety of expertise. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 3 weeks ago
0 years
0 Lacs
Trivandrum, Kerala, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Job Description – Business Consultant P&C (Property & Casualty - Personal and Commercial Insurance) The candidate should have experience working in Property & Casualty lines (both Personal and Commercial Insurance) and should be familiar with any one or more functional processes – PC, BC, CC (preferred Guidewire/Duckcreek). Lines of Business (Personal and Commercial Lines): Must have – Property, Auto, General Liability. Good to have – Casualty Lines (Professional Liability, Directors & Officers, Errors & Omissions, EPL, etc.), Inland Marine, Cargo, Workers Compensation, Umbrella, Excess Liability. Roles and Responsibilities: Worked on multiple business transformation, upgrade and modernization programs. Requirements gathering and elicitation – writing BRDs, FSDs. Conducting JAD sessions and workshops to capture requirements and working closely with the Product Owner. Work with the client to define the most optimal future-state operational process and related product configuration. Define scope by providing innovative solutions and challenging all new client requirements and change requests while simultaneously ensuring that the client gets the required business value. Elaborate and deliver clearly defined requirement documents with relevant dataflow and process flow diagrams. Work closely with the product design and development team to analyse and extract functional enhancements. Provide product consultancy and assist the client with acceptance criteria gathering and support throughout the project life cycle. Product Experience/Other Skills: Product knowledge – Guidewire, Duckcreek, Exigent, Majesco (preferred Guidewire/Duckcreek). Strong skills in stakeholder management and communication. Should have end-to-end process knowledge in the P&C insurance domain. Should be ready to work in flexible shifts (a good amount of overlap with US/UK hours). Good organizational and time management skills required. Should have good written and verbal communication skills in English. Industry certifications AINS 21 - Property and Liability Insurance Principles, AINS 22 - Personal Insurance, AINS 23 - Commercial Insurance and AINS 24 - General Insurance for IT and Support Professionals will be an added advantage. Additional experience in Life or other insurance domains is an added advantage. We expect you to work effectively as a team member and build good relationships with the client. You will have the opportunity to expand your domain knowledge and skills and will be able to collaborate frequently with other EY professionals with a wide variety of expertise. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 3 weeks ago
0 years
0 Lacs
Gurgaon, Haryana, India
On-site
Introduction
A career in IBM Consulting is rooted in long-term relationships and close collaboration with clients across the globe. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio.
Your Role And Responsibilities
As an entry-level Application Developer at IBM, you'll work with clients to co-create solutions to major real-world challenges by using best-practice technologies, tools, techniques, and products to translate system requirements into the design and development of customized systems. In your role, you may be responsible for:
Working across the entire system architecture to design, develop, and support high-quality, scalable products and interfaces for our clients
Collaborating with cross-functional teams to understand requirements and define technical specifications for generative AI projects
Employing IBM's Design Thinking to create products that provide a great user experience along with high performance, security, quality, and stability
Working with a variety of relational and NoSQL databases (SQL, Postgres, DB2, MongoDB), operating systems (Linux, Windows, iOS, Android), and modern UI frameworks (Backbone.js, AngularJS, React, Ember.js, Bootstrap, and jQuery)
Creating everything from mockups and UI components to algorithms and data structures as you deliver a viable product
Preferred Education
Master's Degree
Required Technical And Professional Expertise
SQL authoring, query tuning, and cost optimisation, primarily on BigQuery.
Python as an object-oriented scripting language.
Data pipeline, data streaming, and workflow management tools: Dataflow, Pub/Sub, Hadoop, Spark Streaming… (see the pipeline sketch below)
Version control systems: Git; knowledge of Infrastructure as Code (Terraform) is preferable.
Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL), as well as working familiarity with a variety of databases.
Experience performing root cause analysis on internal and external data and processes to answer specific business questions.
Preferred Technical And Professional Experience
Experience building and optimising data pipelines, architectures and data sets.
Building processes supporting data transformation, data structures, metadata, dependency and workload management.
Working knowledge of message queuing, stream processing, and highly scalable data stores.
Experience supporting and working with cross-functional teams in a dynamic environment.
We are looking for a candidate with experience in a Data Engineer role who is also familiar with Google Cloud Platform.
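For context on the Dataflow and Pub/Sub skills listed above, the following is a minimal, illustrative Apache Beam sketch of a streaming pipeline that reads messages from a Pub/Sub topic and appends them to a BigQuery table. It is written in Java (the same work can be done with the Python SDK mentioned in the posting), and the project, topic, and table names are hypothetical placeholders, not details from the role.

```java
import com.google.api.services.bigquery.model.TableRow;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
import org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder;
import org.apache.beam.sdk.io.gcp.pubsub.PubsubIO;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.options.StreamingOptions;
import org.apache.beam.sdk.transforms.MapElements;
import org.apache.beam.sdk.values.TypeDescriptor;

public class PubSubToBigQuery {
  public static void main(String[] args) {
    // Parse runner options from the command line and enable streaming mode.
    StreamingOptions options =
        PipelineOptionsFactory.fromArgs(args).withValidation().as(StreamingOptions.class);
    options.setStreaming(true);
    Pipeline pipeline = Pipeline.create(options);

    pipeline
        // Read raw JSON messages from a hypothetical Pub/Sub topic.
        .apply("ReadEvents",
            PubsubIO.readStrings().fromTopic("projects/my-project/topics/events"))
        // Wrap each message into a BigQuery row; a real pipeline would parse and validate here.
        .apply("ToTableRow",
            MapElements.into(TypeDescriptor.of(TableRow.class))
                .via((String json) -> new TableRow().set("payload", json)))
        .setCoder(TableRowJsonCoder.of())
        // Stream rows into a hypothetical, pre-created BigQuery table.
        .apply("WriteRows",
            BigQueryIO.writeTableRows()
                .to("my-project:analytics.raw_events")
                .withWriteDisposition(BigQueryIO.Write.WriteDisposition.WRITE_APPEND)
                .withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_NEVER));

    pipeline.run();
  }
}
```

In practice such a pipeline would be submitted with the Dataflow runner (for example with --runner=DataflowRunner) and extended with the parsing, windowing, and error handling the workload requires.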
Posted 3 weeks ago
0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Job Description – Business Consultant P&C (Property & Casualty – Personal and Commercial Insurance). Same EY role description, lines of business, responsibilities, and skill requirements as the Business Consultant P&C posting above.
Posted 3 weeks ago
6.0 - 8.0 years
6 - 9 Lacs
Pune, Bengaluru
Work from Office
Role & responsibilities
Bachelor's in Computer Science, Engineering, or equivalent experience
7+ years of experience in core Java and the Spring Framework (Required)
2 years of cloud experience (GCP, AWS, or Azure; GCP preferred) (Required)
Experience in big data processing on a distributed system (Required)
Experience with databases: RDBMS, NoSQL, and cloud-native databases (Required)
Experience handling various data formats such as flat files, JSON, Avro, XML, etc., including defining the schemas and the contracts (Required; see the schema sketch below)
Experience implementing data pipelines (ETL) using Dataflow (Apache Beam)
Experience with microservices and integration patterns of APIs with data processing
Experience in data structures, and in defining and designing data models
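To illustrate the schema-and-contract work mentioned in the list above, here is a small, hedged Avro sketch in Java; the record name, namespace, and fields are assumptions made purely for the example, not part of the posting.

```java
import org.apache.avro.Schema;
import org.apache.avro.SchemaBuilder;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;

public class OrderSchemaExample {
  public static void main(String[] args) {
    // Define the record contract in code; the same schema could also live in a .avsc file.
    Schema orderSchema = SchemaBuilder.record("Order")
        .namespace("com.example.pipeline")   // hypothetical namespace
        .fields()
        .requiredString("orderId")
        .requiredDouble("amount")
        .optionalString("currency")
        .endRecord();

    // Build a record that conforms to the contract.
    GenericRecord order = new GenericData.Record(orderSchema);
    order.put("orderId", "ORD-1001");
    order.put("amount", 249.99);
    order.put("currency", "INR");

    // Print the schema (pretty JSON) and the record for inspection.
    System.out.println(orderSchema.toString(true));
    System.out.println(order);
  }
}
```

Keeping contracts like this versioned alongside the pipeline code is one common way to manage schema evolution across JSON, Avro, and other formats.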
Posted 3 weeks ago
7.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
Equifax is where you can power your possible. If you want to achieve your true potential, chart new paths, develop new skills, collaborate with bright minds, and make a meaningful impact, we want to hear from you.
Equifax is seeking creative, high-energy and driven software engineers with hands-on development skills to work on a variety of meaningful projects. Our software engineering positions provide you the opportunity to join a team of talented engineers working with leading-edge technology. You are ideal for this position if you are a forward-thinking, committed, and enthusiastic software engineer who is passionate about technology.
What You’ll Do
Demonstrate a deep understanding of cloud-native, distributed, microservice-based architectures
Deliver solutions for complex business problems through a standard software SDLC
Build strong relationships with both internal and external stakeholders, including product, business and sales partners
Demonstrate excellent communication skills, with the ability to both simplify complex problems and dive deeper when needed
Build and manage strong technical teams that deliver complex software solutions that scale
Manage teams with cross-functional skills that include software, quality and reliability engineers, project managers and scrum masters
Provide deep troubleshooting skills, with the ability to lead and solve production and customer issues under pressure
Leverage strong experience in full-stack software development and public cloud platforms such as GCP and AWS
Mentor, coach and develop junior and senior software, quality and reliability engineers
Lead with a data/metrics-driven mindset and a maniacal focus on optimizing and creating efficient solutions
Ensure compliance with EFX secure software development guidelines and best practices, and be responsible for meeting and maintaining QE, DevSec, and FinOps KPIs
Define, maintain and report SLAs, SLOs and SLIs meeting EFX engineering standards, in partnership with the product, engineering and architecture teams
Collaborate with architects, SRE leads and other technical leadership on strategic technical direction, guidelines, and best practices
Drive up-to-date technical documentation, including support and end-user documentation and runbooks
Lead Sprint planning, Sprint retrospectives, and other team activities
Responsible for implementation architecture decision-making associated with product features/stories, refactoring work, and EOSL decisions
Create and deliver technical presentations to internal and external technical and non-technical stakeholders, communicating with clarity and precision, and present complex information in a concise format that is audience-appropriate
What Experience You Need
Bachelor's degree or equivalent experience
7+ years of software engineering experience
7+ years of experience writing, debugging, and troubleshooting code in mainstream Java, SpringBoot, TypeScript/JavaScript, HTML, CSS
7+ years of experience with cloud technology: GCP, AWS, or Azure
7+ years of experience designing and developing cloud-native solutions
7+ years of experience designing and developing microservices using Java, SpringBoot, GCP SDKs, GKE/Kubernetes (see the sketch after this posting)
7+ years of experience deploying and releasing software using Jenkins CI/CD pipelines, with an understanding of infrastructure-as-code concepts, Helm charts, and Terraform constructs
What could set you apart
Self-starter who identifies and responds to priority shifts with minimal supervision
Strong communication and presentation skills
Strong leadership qualities
Demonstrated problem-solving skills and the ability to resolve conflicts
Experience creating and maintaining product and software roadmaps
Experience overseeing yearly as well as product/project budgets
Working in a highly regulated environment
Experience designing and developing big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, Pub/Sub, GCS, Composer/Airflow, and others
UI development (e.g. HTML, JavaScript, Angular and Bootstrap)
Experience with backend technologies such as Java/J2EE, SpringBoot, SOA and microservices
Source code control management systems (e.g. SVN/Git, GitHub) and build tools like Maven and Gradle
Agile environments (e.g. Scrum, XP)
Relational databases (e.g. SQL Server, MySQL)
Atlassian tooling (e.g. JIRA, Confluence, and GitHub)
Developing with a modern JDK (v1.7+)
Automated testing: JUnit, Selenium, LoadRunner, SoapUI
We offer a hybrid work setting, comprehensive compensation and healthcare packages, attractive paid time off, and organizational growth potential through our online learning platform with guided career tracks. Are you ready to power your possible? Apply today, and get started on a path toward an exciting new career at Equifax, where you can make a difference!
Who is Equifax?
At Equifax, we believe knowledge drives progress. As a global data, analytics and technology company, we play an essential role in the global economy by helping employers, employees, financial institutions and government agencies make critical decisions with greater confidence. We work to help create seamless and positive experiences during life’s pivotal moments: applying for jobs or a mortgage, financing an education or buying a car. Our impact is real and to accomplish our goals we focus on nurturing our people for career advancement and their learning and development, supporting our next generation of leaders, maintaining an inclusive and diverse work environment, and regularly engaging and recognizing our employees. Regardless of location or role, the individual and collective work of our employees makes a difference and we are looking for talented team players to join us as we help people live their financial best.
Equifax is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.
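As a rough illustration of the Java/SpringBoot microservice experience called for above, the sketch below shows a minimal Spring Boot REST endpoint. The application name, route, and response payload are hypothetical and not taken from the posting; a production service would add a service layer, persistence, validation, and error handling.

```java
import java.util.Map;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RestController;

@SpringBootApplication
@RestController
public class ReportServiceApplication {

  public static void main(String[] args) {
    // Boots the embedded web server and registers the REST controller below.
    SpringApplication.run(ReportServiceApplication.class, args);
  }

  // Minimal read endpoint; real services would delegate to a service layer
  // and return domain objects with proper status codes and error handling.
  @GetMapping("/reports/{id}")
  public Map<String, String> getReport(@PathVariable String id) {
    return Map.of("reportId", id, "status", "AVAILABLE");
  }
}
```

A service like this is typically packaged as a container image and deployed to GKE/Kubernetes behind a CI/CD pipeline such as the Jenkins setup the posting describes.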
Posted 3 weeks ago
5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Equifax is where you can power your possible. If you want to achieve your true potential, chart new paths, develop new skills, collaborate with bright minds, and make a meaningful impact, we want to hear from you.
Equifax is seeking creative, high-energy and driven software engineers with hands-on development skills to work on a variety of meaningful projects. Our software engineering positions provide you the opportunity to join a team of talented engineers working with leading-edge technology. You are ideal for this position if you are a forward-thinking, committed, and enthusiastic software engineer who is passionate about technology.
What You’ll Do
Design, develop, and operate high-scale applications across the full engineering stack
Design, develop, test, deploy, maintain, and improve software
Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.)
Work across teams to integrate our systems with existing internal systems, Data Fabric, and the CSA Toolset
Participate in technology roadmap and architecture discussions to turn business requirements and vision into reality
Participate in a tight-knit, globally distributed engineering team
Triage product or system issues and debug/track/resolve them by analyzing the sources of issues and their impact on network or service operations and quality
Manage sole project priorities, deadlines, and deliverables
Research, create, and develop software applications to extend and improve on Equifax solutions
Collaborate on scalability issues involving access to data and information
Actively participate in Sprint planning, Sprint retrospectives, and other team activities
What Experience You Need
Bachelor's degree or equivalent experience
5+ years of software engineering experience
5+ years of experience writing, debugging, and troubleshooting code in mainstream Java, SpringBoot, TypeScript/JavaScript, HTML, CSS
5+ years of experience with cloud technology: GCP, AWS, or Azure
5+ years of experience designing and developing cloud-native solutions
5+ years of experience designing and developing microservices using Java, SpringBoot, GCP SDKs, GKE/Kubernetes
5+ years of experience deploying and releasing software using Jenkins CI/CD pipelines, with an understanding of infrastructure-as-code concepts, Helm charts, and Terraform constructs
What could set you apart
Self-starter who identifies and responds to priority shifts with minimal supervision
Experience designing and developing big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, Pub/Sub, GCS, Composer/Airflow, and others
UI development (e.g. HTML, JavaScript, Angular and Bootstrap)
Experience with backend technologies such as Java/J2EE, SpringBoot, SOA and microservices
Source code control management systems (e.g. SVN/Git, GitHub) and build tools like Maven and Gradle
Agile environments (e.g. Scrum, XP)
Relational databases (e.g. SQL Server, MySQL)
Atlassian tooling (e.g. JIRA, Confluence, and GitHub)
Developing with a modern JDK (v1.7+)
Automated testing: JUnit, Selenium, LoadRunner, SoapUI (see the test sketch after this posting)
We offer a hybrid work setting, comprehensive compensation and healthcare packages, attractive paid time off, and organizational growth potential through our online learning platform with guided career tracks. Are you ready to power your possible? Apply today, and get started on a path toward an exciting new career at Equifax, where you can make a difference!
Who is Equifax?
At Equifax, we believe knowledge drives progress. As a global data, analytics and technology company, we play an essential role in the global economy by helping employers, employees, financial institutions and government agencies make critical decisions with greater confidence. We work to help create seamless and positive experiences during life’s pivotal moments: applying for jobs or a mortgage, financing an education or buying a car. Our impact is real and to accomplish our goals we focus on nurturing our people for career advancement and their learning and development, supporting our next generation of leaders, maintaining an inclusive and diverse work environment, and regularly engaging and recognizing our employees. Regardless of location or role, the individual and collective work of our employees makes a difference and we are looking for talented team players to join us as we help people live their financial best.
Equifax is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.
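For the automated-testing expectation noted above, here is a minimal JUnit 5 sketch. The CreditScoreBandTest class and the band() helper it exercises are hypothetical examples invented for illustration, not part of the role or any Equifax codebase.

```java
import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.junit.jupiter.api.Assertions.assertThrows;

import org.junit.jupiter.api.Test;

class CreditScoreBandTest {

  // Hypothetical helper under test: maps a numeric score to a band label.
  static String band(int score) {
    if (score < 300 || score > 850) {
      throw new IllegalArgumentException("score out of range: " + score);
    }
    return score >= 700 ? "PRIME" : "SUBPRIME";
  }

  @Test
  void mapsScoresToBands() {
    // Happy-path cases on both sides of the band boundary.
    assertEquals("PRIME", band(720));
    assertEquals("SUBPRIME", band(640));
  }

  @Test
  void rejectsOutOfRangeScores() {
    // Invalid input should fail fast rather than produce a misleading band.
    assertThrows(IllegalArgumentException.class, () -> band(900));
  }
}
```

Tests like these typically run in the same Jenkins CI/CD pipeline that builds and deploys the service, so regressions surface before release.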
Posted 3 weeks ago
8.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Minimum qualifications:
Bachelor's degree in Computer Science, a related technical field, or equivalent practical experience.
8 years of experience with software development in one or more programming languages (e.g., Python, C, C++, Java, JavaScript).
3 years of experience in a technical leadership role overseeing projects, with 2 years of experience in a people management, supervision, or team leadership role.
Experience in one or more disciplines such as machine learning, recommendation systems, natural language processing, computer vision, pattern recognition, or artificial intelligence.
Preferred qualifications:
Understanding of agentic AI/ML and Large Language Models (LLMs).
Excellent coding skills.
About The Job
Like Google's own ambitions, the work of a Software Engineer goes beyond just Search. Software Engineering Managers have not only the technical expertise to take on and provide technical leadership to major projects, but also manage a team of engineers. You not only optimize your own code but make sure engineers are able to optimize theirs. As a Software Engineering Manager you manage your project goals, contribute to product strategy and help develop your team. Teams work all across the company, in areas such as information retrieval, artificial intelligence, natural language processing, distributed computing, large-scale system design, networking, security, data compression, and user interface design; the list goes on and is growing every day. Operating with scale and speed, our exceptional software engineers are just getting started, and as a manager, you guide the way. With technical and leadership expertise, you manage engineers across multiple teams and locations, a large product budget, and oversee the deployment of large-scale projects across multiple sites internationally.
At Corp Eng, we build world-leading business solutions that scale a more helpful Google for everyone. As Google’s IT organization, we provide end-to-end solutions for organizations across Google. We deliver the right tools, platforms, and experiences for all Googlers as they create more helpful products and services for everyone. In the simplest terms, we are Google for Googlers.
Responsibilities
Manage a team of AI software engineers, fostering a collaborative and high-performing environment. This includes hiring, mentoring, performance management, and career development.
Drive the design, development, and deployment of scalable and reliable Artificial Intelligence/Machine Learning (AI/ML) systems and infrastructure relevant to HR applications (e.g., talent acquisition, performance management, employee engagement, workforce planning).
Collaborate with Product Managers and HR stakeholders to understand business needs, define product requirements, and translate them into technical specifications and project plans.
Oversee the architecture and implementation of data pipelines using Google's data processing infrastructure (e.g., Beam, Dataflow) to support AI/ML initiatives.
Stay up to date with the latest advancements in AI/ML and related technologies, evaluating their potential application within human resources and guiding the team's adoption of relevant innovations.
Google is proud to be an equal opportunity workplace and is an affirmative action employer. We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity or Veteran status. We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. See also Google's EEO Policy and EEO is the Law. If you have a disability or special need that requires accommodation, please let us know by completing our Accommodations for Applicants form.
Posted 3 weeks ago