
671 Dataflow Jobs

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

1.0 years

4 - 7 Lacs

Haryāna

On-site


Job Title: Nursing/Health Care Assistant
Location: Oman
Employment Type: Full-Time (rotational shifts, weekend availability)
Salary: 250 to 300 OMR per month
Reports To: RNs / LPNs / Nurse Manager

Job Summary
We are seeking a compassionate and dedicated Nursing/Health Care Assistant to support our nursing and rehabilitation team in delivering exceptional patient care. Under the supervision of RNs/LPNs, you will assist with daily living activities, monitor vital signs, maintain hygiene and safety, support therapy sessions, manage feeding and incontinence, perform light housekeeping, and assist with admissions, transfers, and transportation.

Key Responsibilities
1. Personal Care & Activities of Daily Living: Assist patients with bathing, grooming, dressing, toileting, and incontinence care. Support mobility: transfers, ambulation, positioning, turning to prevent bedsores, and range-of-motion exercises. Provide tube feeding and feeding assistance when necessary.
2. Observation & Monitoring: Measure and record vital signs (BP, pulse, temperature, respiration) and intake/output per shift. Observe and document changes in behaviour, mood, physical condition, or signs of distress/aggression, and report promptly. Assist in restraining patients as per rehabilitation protocols.
3. Therapeutic Support: Aid physiotherapists and participate in group or individual therapy sessions. Escort patients in emergency and non-emergency situations within the facility or to outpatient (OPD) appointments and events.
4. Medical & Equipment Care: Support light medical tasks under supervision (e.g., non-sterile dressings, routine equipment/supply care). Perform inventory checks and ensure medical supplies/equipment are organized and functional.
5. Environment & Safety: Ensure patient rooms are clean and hygienic: change linens, sanitize equipment, tidy rooms. Maintain infection control, follow health & safety protocols, and supervise patients to prevent falls or harm.
6. Admissions, Transfers & Documentation: Assist with patient admissions, transfers, and discharges. Accurately record care activities, observations, vitals, feeding, and output in patient charts.
7. Emotional & Companionship Support: Provide compassionate companionship, basic patient education, and emotional support.

Qualifications & Skills
- ANM diploma (2-year), CNA/Healthcare Assistant certification, GNM/BSc, or other relevant qualification.
- 1–3 years of healthcare experience minimum; 3+ years preferred.
- CPR/BLS certification advantageous.
- Valid Dataflow clearance (for international candidates).
- Strong interpersonal, communication, empathy, and confidentiality skills.
- Physically able to lift up to ~50 lbs, stand for long periods, and perform patient transfers.

Working Hours & Benefits
Schedule: Rotational shifts; weekend availability.
Benefits:
- Free joining ticket (reimbursed after the 3-month probation period)
- 30 days paid annual leave after 1 year of service completion
- Yearly up-and-down air ticket
- Medical insurance
- Life insurance
- Accommodation (chargeable up to OMR 20)

Note: Interested candidates, please call us at 97699 11050 or 99302 65888, or email your CV to recruitment@thegrowthhive.org.

Job Type: Full-time
Pay: ₹40,000.00 - ₹60,000.00 per month
Benefits: Food provided, Health insurance, Provident Fund
Schedule: Monday to Friday, rotational shift, weekend availability
Work Location: In person

Posted 20 hours ago

Apply

6.0 years

5 - 9 Lacs

Bengaluru

On-site


Candidates for this position are preferred to be based in Bangalore, India and will be expected to comply with their team's hybrid work schedule requirements.

Who We Are:
Wayfair runs the largest custom e-commerce large parcel network in the United States, with approximately 1.6 million square meters of logistics space. The network is inherently a highly variable ecosystem that requires flexible, reliable, and resilient systems to operate efficiently. We are looking for a passionate Backend Software Engineer to join the Fulfillment Optimization team. This team builds the platforms that determine how customer orders are fulfilled, optimizing for Wayfair profitability and customer delight. A big part of our work revolves around enhancing and scaling customer-facing platforms that provide fulfillment information on our websites, starting at the top of the customer funnel on the search pages all the way through orders being delivered. Throughout this customer journey, we are responsible for maintaining an accurate representation of our dynamic supply chain, determining how different products will fit into boxes, predicting how these boxes will flow through warehouses and trucks, and ultimately surfacing the information our customers need to inform their decisions and the details our suppliers and carriers require to successfully execute on the promises made to our customers. We do all of this in milliseconds, thousands of times per second.

About the Role:
As a Data Engineer, you will be part of the Data Engineering team. This role is inherently multi-functional; the ideal candidate will work with Data Scientists, Analysts, and Application teams across the company, as well as all other Data Engineering squads at Wayfair. We are looking for someone with a love for data, the ability to understand requirements clearly, and the ability to iterate quickly. Successful candidates will have strong engineering and communication skills and a belief that data-driven processes lead to phenomenal products.

What you'll do:
- Build and launch data pipelines and data products focused on the SMART Org, helping teams push the boundaries of insights, creating new product features using data, and powering machine learning models.
- Build cross-functional relationships to understand data needs, build key metrics, and standardize their usage across the organization.
- Utilize current and leading-edge technologies in software engineering, big data, streaming, and cloud infrastructure.

What You'll Need:
- Bachelor's/Master's degree in Computer Science or a related technical subject area, or an equivalent combination of education and experience.
- 6+ years of relevant work experience in the Data Engineering field with web-scale data sets.
- Demonstrated strength in data modeling, ETL development, and data lake architecture.
- Data warehousing experience with big data technologies (Hadoop, Spark, Hive, Presto, Airflow, etc.).
- Coding proficiency in at least one modern programming language (Python, Scala, etc.).
- Experience building/operating highly available, distributed systems for data extraction, ingestion, and processing, plus query performance tuning skills for large data sets.
- Industry experience as a Big Data Engineer working alongside cross-functional teams such as Software Engineering, Analytics, and Data Science, with a track record of manipulating, processing, and extracting value from large datasets.
- Strong business acumen.
- Experience leading large-scale data warehousing and analytics projects, including using GCP technologies (BigQuery, Dataproc, GCS, Cloud Composer, Dataflow) or related big data technologies on other cloud platforms such as AWS or Azure.
- Team player who introduces and follows best practices in the data engineering space.
- Ability to effectively communicate (both written and verbally) technical information and the results of engineering design at all levels of the organization.

Good to have:
- Exposure to NoSQL databases and Pub/Sub architecture setup.
- Familiarity with BI tools like Looker, Tableau, AtScale, PowerBI, or similar.

About Wayfair Inc.
Wayfair is one of the world's largest online destinations for the home. Whether you work in our global headquarters in Boston, or in our warehouses or offices throughout the world, we're reinventing the way people shop for their homes. Through our commitment to industry-leading technology and creative problem-solving, we are confident that Wayfair will be home to the most rewarding work of your career. If you're looking for rapid growth, constant learning, and dynamic challenges, then you'll find that amazing career opportunities are knocking.

No matter who you are, Wayfair is a place you can call home. We're a community of innovators, risk-takers, and trailblazers who celebrate our differences, and know that our unique perspectives make us stronger, smarter, and well-positioned for success. We value and rely on the collective voices of our employees, customers, community, and suppliers to help guide us as we build a better Wayfair – and world – for all. Every voice, every perspective matters. That's why we're proud to be an equal opportunity employer. We do not discriminate on the basis of race, color, ethnicity, ancestry, religion, sex, national origin, sexual orientation, age, citizenship status, marital status, disability, gender identity, gender expression, veteran status, genetic information, or any other legally protected characteristic.

Your personal data is processed in accordance with our Candidate Privacy Notice (https://www.wayfair.com/careers/privacy). If you have any questions or wish to exercise your rights under applicable privacy and data protection laws, please contact us at dataprotectionofficer@wayfair.com.
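For readers unfamiliar with the GCP stack this posting references, here is a minimal, illustrative sketch of a batch load from Cloud Storage into BigQuery using the google-cloud-bigquery Python client. It is not the employer's code; the project, bucket, and table names are hypothetical placeholders.

```python
# Minimal sketch: load a day's CSV exports from Cloud Storage into a BigQuery table.
# Project, bucket, dataset, and table names are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # uses application default credentials

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,          # skip the header row
    autodetect=True,              # infer the schema from the files
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

load_job = client.load_table_from_uri(
    "gs://example-bucket/orders/2024-01-01/*.csv",   # hypothetical source files
    "example-project.fulfillment.orders_raw",        # hypothetical destination table
    job_config=job_config,
)
load_job.result()  # block until the load job finishes

table = client.get_table("example-project.fulfillment.orders_raw")
print(f"Loaded {table.num_rows} rows into {table.full_table_id}")
```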

Posted 21 hours ago

Apply

0 years

9 - 9 Lacs

Bengaluru

On-site


Associate - Production Support Engineer
Job ID: R0388737
Full/Part-Time: Full-time
Regular/Temporary: Regular
Listed: 2025-06-27
Location: Bangalore

Position Overview
Job Title: Associate - Production Support Engineer
Location: Bangalore, India

Role Description
You will be operating within Corporate Bank Production as an Associate Production Support Engineer in the Corporate Banking subdivisions. You will be accountable for driving a culture of proactive continual improvement into the production environment through application and user request support, troubleshooting and resolving errors in the production environment, automation of manual work, monitoring improvements, and platform hygiene; you will also support the resolution of issues and conflicts and prepare reports and meetings. The candidate should have experience with all relevant tools used in the Service Operations environment and specialist expertise in one or more technical domains, and should ensure that all associated Service Operations stakeholders are provided with an optimum level of service in line with Service Level Agreements (SLAs) / Operating Level Agreements (OLAs). Ensure all BAU support queries from the business are handled on priority and within the agreed SLA, and ensure all application stability issues are well taken care of. Support the resolution of incidents and problems within the team. Assist with the resolution of complex incidents. Ensure that the right problem-solving techniques and processes are applied. Embrace a Continuous Service Improvement approach to resolve IT failings, drive efficiencies and remove repetition to streamline support activities, reduce risk, and improve system availability. Be responsible for your own engineering delivery and, using data and analytics, drive a reduction in technical debt across the production environment with development and infrastructure teams. Act as a Production Engineering role model to enhance the technical capability of the Production Support teams to create a future operating model embedded with engineering culture.

Deutsche Bank’s Corporate Bank division is a leading provider of cash management, trade finance and securities finance. We complete green-field projects that deliver the best Corporate Bank - Securities Services products in the world. Our team is diverse, international, and driven by a shared focus on clean code and valued delivery. At every level, agile minds are rewarded with competitive pay, support, and opportunities to excel. You will work as part of a cross-functional agile delivery team. You will bring an innovative approach to software development, focusing on using the latest technologies and practices, as part of a relentless focus on business value. You will be someone who sees engineering as a team activity, with a predisposition to open code, open discussion and creating a supportive, collaborative environment. You will be ready to contribute to all stages of software delivery, from initial analysis right through to production support.

What we’ll offer you
As part of our flexible scheme, here are just some of the benefits that you’ll enjoy:
- Best in class leave policy
- Gender neutral parental leaves
- 100% reimbursement under childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Employee Assistance Program for you and your family members
- Comprehensive hospitalization insurance for you and your dependents
- Accident and term life insurance
- Complimentary health screening for 35 yrs. and above

Your key responsibilities
- Lead by example to drive a culture of proactive continual improvement into the production environment through automation of manual work, monitoring improvements and platform hygiene.
- Carry out technical analysis of the production platform to identify and remediate performance and resiliency issues.
- Engage in the Software Development Lifecycle (SDLC) to enhance production standards and controls.
- Update the RUN Book and KEDB as and when required.
- Participate in all BCP and component failure tests based on the run books.
- Understand the flow of data through the application infrastructure; it is critical to understand the dataflow to best provide operational support.
- Perform event monitoring and management via a 24x7 workbench that is both monitoring and regularly probing the service environment and acting on instruction of the run book.
- Drive knowledge management across the supported applications and ensure full compliance.
- Work with team members to identify areas of focus where training may improve team performance and incident resolution.

Your skills and experience
Recent experience of applying technical solutions to improve the stability of production environments, and working experience of some of the following technology skills:
- Technologies/Frameworks: Unix, Shell Scripting and/or Python; SQL stack; Oracle 12c/19c for PL/SQL, with familiarity with OEM tooling to review AWR reports and parameters; ITIL v3 certified (must); Control-M, CRON scheduling; MQ (DBUS, IBM); Java 8/OpenJDK 11 (at least) for debugging; familiarity with the Spring Boot framework; data streaming with Kafka (experience with the Confluent flavor a plus) and ZooKeeper; Hadoop framework
- Configuration Mgmt Tooling: Ansible
- Operating System/Platform: RHEL 7.x (preferred), RHEL 6.x; OpenShift (as we move towards cloud computing and the fact that Fabric is dependent on OpenShift)
- CI/CD: Jenkins (preferred)
- APM Tooling: one of Splunk, AppDynamics, Geneos, NewRelic
- Other platforms: Scheduling – Ctrl-M is a plus, Autosys, etc.; Search – Elastic Search and/or Solr+ is a plus
- Methodology: micro-services architecture; SDLC; Agile fundamentals; network topology (TCP, LAN, VPN, GSLB, GTM, etc.); familiarity with TDD and/or BDD; distributed systems; experience on cloud platforms such as Azure or GCP is a plus; familiarity with containerization/Kubernetes
- Tools: ServiceNow, Jira, Confluence, BitBucket and/or Git, IntelliJ, SQL Plus; familiarity with simple Unix tooling (putty, mPutty, Exceed); (PL/)SQL Developer
Also required: a good understanding of the ITIL Service Management framework, such as Incident, Problem, and Change processes; the ability to self-manage a book of work and ensure clear transparency on progress, with clear, timely communication of issues; excellent communication skills, both written and verbal, with attention to detail; the ability to work in a Follow the Sun model, in virtual teams and in a matrix structure; Service Operations experience within a global operations context; 6-9 years of experience in IT in large corporate environments, specifically in the area of controlled production environments or in Financial Services Technology in a client-facing function (Global Transaction Banking experience is a plus); experience of end-to-end Level 2/3/4 management and a good overview of Production/Operations Management overall; experience of run-book execution; experience of supporting complex application and infrastructure domains; good analytical, troubleshooting and problem-solving skills; and working knowledge of incident tracking tools (e.g., Remedy, Heat, etc.).

How we’ll support you
- Training and development to help you excel in your career
- Coaching and support from experts in your team
- A culture of continuous learning to aid progression
- A range of flexible benefits that you can tailor to suit your needs

About us and our teams
Please visit our company website for further information: https://www.db.com/company/company.htm
We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.

Posted 21 hours ago

Apply

5.0 years

3 - 9 Lacs

Chennai

On-site


The Industrial System Analytics (ISA) team within GDIA develops cutting-edge cloud analytic solutions using GCP tools and techniques to drive strategic insights across Ford. As a Product Owner (Supervisor), you will be a critical leader within our product-driven organization. You will be responsible for defining, prioritizing, and delivering high-value data products and analytical solutions that directly address key business challenges. This role requires a strong blend of strategic product thinking, hands-on agile execution, and the ability to lead, mentor, and guide your team (or cross-functional teams) to achieve exceptional outcomes in a dynamic, data-intensive environment.

You’ll have…
- Bachelor's degree in a quantitative field such as Computer Science, Engineering, Information Systems, Business Analytics, or a related discipline.
- 5+ years of experience as a Product Owner, Business Analyst, or similar role managing digital products or data solutions.
- Demonstrated experience in defining product roadmaps, managing backlogs, and prioritizing features.
- Proven experience working within an Agile software development environment.
- Experience gathering and translating business requirements into technical specifications and user stories.
- Strong understanding of data analytics, AI/ML concepts, and how they can drive business value.
- Familiarity with cloud platforms, preferably Google Cloud Platform (GCP) services (e.g., BigQuery, GCS, Dataflow).
- Excellent communication, interpersonal, and stakeholder management skills.

Even better, you may have…
- Master's degree or PhD in a quantitative field.
- Experience supervising or mentoring other Product Owners or team members.
- Hands-on experience with data visualization tools (e.g., Tableau, Power BI, Looker).
- Proficiency in SQL and/or scripting languages (e.g., Python) for data exploration.
- Knowledge of Ford's internal data ecosystems or IT systems.
- Experience with DevSecOps practices and tools (e.g., CI/CD pipelines, Jira, GitHub).
- Certified Scrum Product Owner (CSPO) or similar Agile certification.
- Proven ability to balance "doing it right" with "speed to delivery" in a fast-paced environment.
- Inquisitive, proactive, and interested in learning new tools and techniques.

Product Strategy & Vision: Translate high-level business objectives and customer needs into a clear product vision, strategy, and measurable outcomes for your product area. Communicate the product vision and strategy effectively to the development team, stakeholders, and leadership, ensuring alignment and buy-in. Gather and analyze customer/internal feedback to continuously refine the product roadmap and drive improvements.

Backlog Management & Prioritization: Own, define, and prioritize the product backlog, ensuring it is well-groomed with clear, actionable user stories and acceptance criteria. Collaborate closely with engineering, data science, and UX teams to refine requirements and ensure technical feasibility and optimal solution design. Manage interdependencies across features and product releases, identifying and proactively mitigating risks to delivery.

Stakeholder Collaboration & Communication: Act as the primary liaison between business stakeholders, customers, and the development team, fostering strong relationships. Translate complex technical concepts into understandable business language and vice versa, facilitating effective decision-making. Manage stakeholder expectations and provide regular, transparent updates on product progress, risks, and achievements. Act as a strategic consultant to the business, guiding them towards optimal data-driven solutions rather than just fulfilling requests.

Product Delivery & Quality Assurance: Ensure that delivered software and analytical solutions meet desired business outcomes, quality standards, and compliance requirements (e.g., security, legal, Ford policies). Collaborate with the team to define relevant analytics and metrics to track product performance, adoption, and realized business value. Facilitate user acceptance testing and feedback loops to ensure product adoption and satisfaction.

Agile Leadership & Process Improvement: Champion Agile software development principles, culture, and best practices within your team and across the organization. Lead and facilitate team ceremonies (e.g., sprint planning, reviews, retrospectives) to ensure efficient and effective delivery. Mentor, coach, and guide team members (including junior Product Owners, if applicable, or cross-functional team members) in product ownership best practices, problem-solving, and continuous improvement. Ensure effective usage of agile tools (e.g., Jira) and derive meaningful insights for continuous improvement of processes and delivery. Drive adoption of DevSecOps and software craftsmanship practices (CI/CD, TDD) where applicable.

Posted 21 hours ago

Apply

5.0 years

0 Lacs

Udaipur, Rajasthan, India

On-site


For a quick response, please fill out the Job Application Form (34043 - Data Scientist - Senior I - Udaipur): https://docs.google.com/forms/d/e/1FAIpQLSeBy7r7b48Yrqz4Ap6-2g_O7BuhIjPhcj-5_3ClsRAkYrQtiA/viewform

- 3–5 years of experience in Data Engineering or similar roles
- Strong foundation in cloud-native data infrastructure and scalable architecture design
- Build and maintain reliable, scalable ETL/ELT pipelines using modern cloud-based tools
- Design and optimize Data Lakes and Data Warehouses for real-time and batch processing
- Ingest, transform, and organize large volumes of structured and unstructured data
- Collaborate with analysts, data scientists, and backend engineers to define data needs
- Monitor, troubleshoot, and improve pipeline performance, cost-efficiency, and reliability
- Implement data validation, consistency checks, and quality frameworks
- Apply data governance best practices and ensure compliance with privacy and security standards
- Use CI/CD tools to deploy workflows and automate pipeline deployments
- Automate repetitive tasks using scripting, workflow tools, and scheduling systems
- Translate business logic into data logic while working cross-functionally
- Strong in Python and familiar with libraries like pandas and PySpark
- Hands-on experience with at least one major cloud provider (AWS, Azure, GCP)
- Experience with ETL tools like AWS Glue, Azure Data Factory, GCP Dataflow, or Apache NiFi
- Proficient with storage systems like S3, Azure Blob Storage, GCP Cloud Storage, or HDFS
- Familiar with data warehouses like Redshift, BigQuery, Snowflake, or Synapse
- Experience with serverless computing like AWS Lambda, Azure Functions, or GCP Cloud Functions
- Familiar with data streaming tools like Kafka, Kinesis, Pub/Sub, or Event Hubs
- Proficient in SQL, with knowledge of relational (PostgreSQL, MySQL) and NoSQL (MongoDB, DynamoDB) databases
- Familiar with big data frameworks like Hadoop or Apache Spark
- Experience with orchestration tools like Apache Airflow, Prefect, GCP Workflows, or ADF Pipelines
- Familiarity with CI/CD tools like GitLab CI, Jenkins, Azure DevOps
- Proficient with Git, GitHub, or GitLab workflows
- Strong communication, collaboration, and problem-solving mindset
- Experience with data observability or monitoring tools (bonus points)
- Contributions to internal data platform development (bonus points)
- Comfort working in data mesh or distributed data ownership environments (bonus points)
- Experience building data validation pipelines with Great Expectations or similar tools (bonus points)
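As an illustration of the ETL/ELT work this listing describes, here is a minimal PySpark sketch (read raw events, clean them, write a date-partitioned table). It is only a sketch under assumed inputs; the storage paths and column names are hypothetical placeholders.

```python
# Minimal sketch of a batch ELT step with PySpark: read raw events, clean them,
# and write a partitioned dataset back to object storage. Paths are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("events-daily-etl").getOrCreate()

raw = spark.read.json("s3a://example-raw-bucket/events/2024-01-01/")  # hypothetical path

cleaned = (
    raw.dropDuplicates(["event_id"])                      # basic consistency check
       .filter(F.col("user_id").isNotNull())              # drop incomplete records
       .withColumn("event_date", F.to_date("event_ts"))   # derive a partition column
)

(cleaned.write
    .mode("overwrite")
    .partitionBy("event_date")
    .parquet("s3a://example-curated-bucket/events/"))     # hypothetical destination

spark.stop()
```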

Posted 22 hours ago

Apply

30.0 years

0 Lacs

Gurugram, Haryana, India

On-site


About the Client
Our client is a market-leading company with over 30 years of experience in the industry. One of the world's leading professional services firms, with $19.7B and 333,640 associates worldwide, it helps clients modernize technology, reimagine processes, and transform experiences, enabling them to remain competitive in our fast-paced world. Its specialties include Intelligent Process Automation, Digital Engineering, Industry & Platform Solutions, Internet of Things, Artificial Intelligence, Cloud, Data, Healthcare, Banking, Finance, Fintech, Manufacturing, Retail, Technology, and Salesforce.

Job Title: GCP Data Engineer
Key Skills: GCP; Big Data; ETL; Big Data / Data Warehousing; BigQuery, Dataproc, Dataflow, Composer
Location: Gurugram
Experience: 7-9 years
Education Qualification: Any graduation
Work Mode: Hybrid
Employment Type: Contract to hire
Notice Period: Immediate to 10 days

Job Description
- BigQuery, Cloud Storage, Cloud Pub/Sub, Dataflow, Dataproc, Composer
- 6+ years in cloud infrastructure and designing data pipelines, specifically on GCP
- Proficiency in programming languages: Python, SQL
- Proven experience in designing and implementing cloud-native applications and microservices on GCP
- Hands-on experience with CI/CD tools like Jenkins and GitHub Actions
- In-depth understanding of GCP networking, IAM policies, and security best practices

Posted 1 day ago

Apply

5.5 years

0 Lacs

Greater Kolkata Area

On-site


Experience: 5.5+ years
Location: Pan India
Immediate joiners preferred

Role Overview
We are looking for a skilled and experienced GCP Data Engineer to join our team. The ideal candidate will have strong expertise in Google Cloud Platform (GCP), Python, PySpark, and SQL, and will be responsible for building scalable data pipelines and solutions in a cloud-native environment.

Key Responsibilities
- Design, develop, and maintain data pipelines and ETL processes on GCP
- Write efficient and optimized code using Python, PySpark, and SQL
- Integrate and transform data from multiple sources into GCP-based data warehouses or data lakes
- Optimize data pipelines for performance, scalability, and cost-efficiency
- Collaborate with data analysts, data scientists, and business stakeholders to understand data needs
- Ensure data quality, reliability, and compliance with security standards

Required Skills
- 5+ years of experience as a Data Engineer
- Strong hands-on experience with Google Cloud Platform (GCP) and its services (e.g., BigQuery, Dataflow, Pub/Sub, Cloud Storage)
- Proficient in Python and PySpark for data engineering tasks
- Advanced knowledge of SQL for data manipulation and analytics
- Experience with data modeling, data warehousing, and big data processing
- Familiarity with CI/CD tools and agile development practices

Preferred Qualifications
- GCP certification (e.g., Professional Data Engineer) is a plus
- Experience with Airflow, Terraform, or other orchestration/automation tools
- Exposure to real-time data processing and streaming (Kafka, Pub/Sub, etc.)

(ref:hirist.tech)
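To illustrate the streaming exposure listed under the preferred qualifications, here is a minimal sketch of publishing and pulling messages with the google-cloud-pubsub Python client. It is only an illustration; the project, topic, and subscription names are hypothetical placeholders.

```python
# Minimal sketch: publish one JSON event to a Pub/Sub topic, then pull and
# acknowledge messages from a subscription. All names are hypothetical.
import json
from concurrent.futures import TimeoutError
from google.cloud import pubsub_v1

project_id = "example-project"

# Publish one event.
publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path(project_id, "orders")
future = publisher.publish(topic_path, json.dumps({"order_id": 123, "total": 49.9}).encode("utf-8"))
print("published message id:", future.result())

# Pull events and acknowledge them as they arrive.
subscriber = pubsub_v1.SubscriberClient()
subscription_path = subscriber.subscription_path(project_id, "orders-etl")

def handle(message):
    event = json.loads(message.data.decode("utf-8"))
    print("received:", event)   # a real pipeline would transform/load here
    message.ack()

streaming_pull = subscriber.subscribe(subscription_path, callback=handle)
try:
    streaming_pull.result(timeout=30)   # listen for 30 seconds, then stop
except TimeoutError:
    streaming_pull.cancel()
    streaming_pull.result()
```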

Posted 1 day ago

Apply

3.0 years

0 Lacs

Coimbatore, Tamil Nadu, India

On-site


About the job: What makes Techjays an inspiring place to work?

At Techjays, we are driving the future of artificial intelligence with a bold mission to empower businesses worldwide by helping them build AI solutions that transform industries. As an established leader in the AI space, we combine deep expertise with a collaborative, agile approach to deliver impactful technology that drives meaningful change. Our global team consists of professionals who have honed their skills at leading companies such as Google, Akamai, NetApp, ADP, Cognizant Consulting, and Capgemini. With engineering teams across the globe, we deliver tailored AI software and services to clients ranging from startups to large-scale enterprises. Be part of a company that's pushing the boundaries of digital transformation. At Techjays, you'll work on exciting projects that redefine industries, innovate with the latest technologies, and contribute to solutions that make a real-world impact. Join us on our journey to shape the future with AI.

We're looking for a skilled Data Analytics Engineer to help us build scalable, data-driven solutions that support real-time decision-making and deep business insights. You'll play a key role in designing and delivering analytics systems that leverage Power BI, Snowflake, and SQL, helping teams across the organization make data-informed decisions with confidence.

Experience: 3 to 8 years
Primary Skills: Power BI / Tableau, SQL, Data Modeling, Data Warehousing, ETL/ELT Pipelines, AWS Glue, AWS Redshift, GCP BigQuery, Azure Data Factory, Cloud Data Pipelines, DAX, Data Visualization, Dashboard Development
Secondary Skills: Python, dbt, Apache Airflow, Git, CI/CD, DevOps for Data, Snowflake, Azure Synapse, Data Governance, Data Lineage, Apache Beam, Data Catalogs, Basic Machine Learning Concepts
Job Location: Coimbatore

Key Responsibilities
- Develop and maintain scalable, robust ETL/ELT data pipelines across structured and semi-structured data sources.
- Collaborate with data scientists, analysts, and business stakeholders to identify data requirements and transform them into efficient data models.
- Design and deliver interactive dashboards and reports using Power BI and Tableau.
- Implement data quality checks, lineage tracking, and monitoring solutions to ensure high reliability of data pipelines.
- Optimize SQL queries and BI reports for performance and scalability.
- Work with cloud-native tools in AWS (e.g., Glue, Redshift, S3), GCP (e.g., BigQuery, Dataflow), or Azure (e.g., Data Factory, Synapse).
- Automate data integration and visualization workflows.

Required Qualifications
- Bachelor's or Master's degree in Computer Science, Information Systems, Data Science, or a related field.
- 3+ years of experience in data engineering or data analytics roles.
- Proven experience with Power BI or Tableau, including dashboard design, DAX, calculated fields, and data blending.
- Proficiency in SQL and experience in data modeling and relational database design.
- Hands-on experience with data pipelines and orchestration using tools like Airflow, dbt, Apache Beam, or native cloud tools.
- Experience working with one or more cloud platforms: AWS, GCP, or Azure.
- Strong understanding of data warehousing concepts and tools such as Snowflake, BigQuery, Redshift, or Synapse.

Preferred Skills
- Experience with scripting in Python or Java for data processing.
- Familiarity with Git, CI/CD, and DevOps for data pipelines.
- Exposure to data governance, lineage, and catalog tools.
- Basic understanding of ML pipelines or advanced analytics is a plus.

Soft Skills
- Strong analytical and problem-solving skills.
- Excellent communication and collaboration abilities.
- Detail-oriented with a proactive approach to troubleshooting and optimization.

What we offer
- Best-in-class packages
- Paid holidays and flexible paid time away
- Casual dress code and flexible working environment
- Medical insurance covering self and family up to 4 lakhs per person
- Work in an engaging, fast-paced environment with ample opportunities for professional development
- Diverse and multicultural work environment
- Be part of an innovation-driven culture that provides the support and resources needed to succeed
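As a rough illustration of the orchestration work mentioned in this listing, here is a minimal Apache Airflow DAG sketch (assuming Airflow 2.4 or later). The DAG ID and task callables are hypothetical placeholders; a real pipeline would replace the print statements with actual extract/load and data-quality logic.

```python
# Minimal sketch of an Airflow DAG that runs an extract/load step daily and
# then a data-quality check. Names and callables are hypothetical placeholders.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_and_load(**context):
    # Placeholder: pull from the source system and load into the warehouse.
    print("extracting and loading for", context["ds"])

def run_quality_checks(**context):
    # Placeholder: row counts, null checks, freshness checks.
    print("running quality checks for", context["ds"])

with DAG(
    dag_id="daily_sales_refresh",      # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    load = PythonOperator(task_id="extract_and_load", python_callable=extract_and_load)
    checks = PythonOperator(task_id="quality_checks", python_callable=run_quality_checks)
    load >> checks                     # run checks only after the load succeeds
```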

Posted 1 day ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site


Position Overview
Job Title: Service Operations - Production Engineer Support, AVP
Location: Pune, India

Role Description
You will be operating within the Corporate Bank Production domain or in Corporate Banking subdivisions as an AVP - Production Support Engineer. In this role, you will be accountable for driving a culture of proactive continual improvement into the production environment through application and user request support, troubleshooting and resolving errors in the production environment, automation of manual work, monitoring improvements, and platform hygiene; training and mentoring new and existing team members; supporting the resolution of issues and conflicts; and preparing reports and meetings. The candidate should have experience with all relevant tools used in the Service Operations environment and specialist expertise in one or more technical domains, and should ensure that all associated Service Operations stakeholders are provided with an optimum level of service in line with Service Level Agreements (SLAs) / Operating Level Agreements (OLAs). Ensure all BAU support queries from the business are handled on priority and within the agreed SLA, and ensure all application stability issues are well taken care of. Support the resolution of incidents and problems within the team. Assist with the resolution of complex incidents. Ensure that the right problem-solving techniques and processes are applied. Embrace a Continuous Service Improvement approach to resolve IT failings, drive efficiencies and remove repetition to streamline support activities, reduce risk, and improve system availability. Be responsible for your own engineering delivery and, using data and analytics, drive a reduction in technical debt across the production environment with development and infrastructure teams. Act as a Production Engineering role model to enhance the technical capability of the Production Support teams to create a future operating model embedded with engineering culture. Train and mentor team members to grow into the next role. Bring in a culture of innovation, engineering, and an automation mindset.

Deutsche Bank’s Corporate Bank division is a leading provider of cash management, trade finance and securities finance. We complete green-field projects that deliver the best Corporate Bank - Securities Services products in the world. Our team is diverse, international, and driven by a shared focus on clean code and valued delivery. At every level, agile minds are rewarded with competitive pay, support, and opportunities to excel. You will work as part of a cross-functional agile delivery team. You will bring an innovative approach to software development, focusing on using the latest technologies and practices, as part of a relentless focus on business value. You will be someone who sees engineering as a team activity, with a predisposition to open code, open discussion and creating a supportive, collaborative environment. You will be ready to contribute to all stages of software delivery, from initial analysis right through to production support.

What We’ll Offer You
As part of our flexible scheme, here are just some of the benefits that you’ll enjoy:
- Best in class leave policy
- Gender neutral parental leaves
- 100% reimbursement under childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Employee Assistance Program for you and your family members
- Comprehensive hospitalization insurance for you and your dependents
- Accident and term life insurance
- Complimentary health screening for 35 yrs. and above

Your Key Responsibilities
- Lead by example to drive a culture of proactive continual improvement into the production environment through automation of manual work, monitoring improvements and platform hygiene.
- Carry out technical analysis of the production platform to identify and remediate performance and resiliency issues.
- Engage in the Software Development Lifecycle (SDLC) to enhance production standards and controls.
- Update the RUN Book and KEDB as and when required.
- Participate in all BCP and component failure tests based on the run books.
- Understand the flow of data through the application infrastructure; it is critical to understand the dataflow so as to best provide operational support.
- Perform event monitoring and management via a 24x7 workbench that is both monitoring and regularly probing the service environment and acting on instruction of a run book.
- Drive knowledge management across the supported applications and ensure full compliance.
- Work with team members to identify areas of focus where training may improve team performance and incident resolution.

Your Skills And Experience
Recent experience of applying technical solutions to improve the stability of production environments, and working experience of some of the following technology skills:
- Technologies/Frameworks: Shell Scripting and/or Python; Java 8/OpenJDK 11 (at least) for debugging; familiarity with the Spring Boot framework; Unix troubleshooting skills; Hadoop framework stack; Oracle 12c/19c for PL/SQL, with familiarity with OEM tooling to review AWR reports and parameters; NoSQL; MQ knowledge; ITIL v3 certified (must)
- Configuration Mgmt Tooling: Ansible
- Operating System/Platform: RHEL 7.x (preferred), RHEL 6.x; OpenShift (as we move towards cloud computing and the fact that Fabric is dependent on OpenShift)
- CI/CD: Jenkins (preferred), TeamCity
- APM Tooling: Splunk, Geneos, NewRelic, Prometheus-Grafana
- Other platforms: Scheduling – Ctrl-M is a plus, Airflow, CRONTAB or Autosys, etc.
- Methodology: micro-services architecture; SDLC; Agile fundamentals; network topology (TCP, LAN, VPN, GSLB, GTM, etc.); distributed systems; experience on cloud platforms such as Azure or GCP is a plus; familiarity with containerization/Kubernetes
- Tools: ServiceNow, Jira, Confluence, BitBucket and/or Git, Oracle, SQL Plus; familiarity with simple Unix tooling (putty, mPutty, Exceed); (PL/)SQL Developer
Also required: a good understanding of the ITIL Service Management framework, such as Incident, Problem, and Change processes; the ability to self-manage a book of work and ensure clear transparency on progress, with clear, timely communication of issues; excellent troubleshooting and problem-solving skills; excellent communication skills, both written and verbal, with attention to detail; and the ability to work in virtual teams and in matrix structures.

Experience | Exposure (Recommended)
- 11+ years of experience in IT in large corporate environments, specifically in the area of controlled production environments or in Financial Services Technology in a client-facing function
- Service Operations and development experience within a global operations context
- Global Transaction Banking experience is a plus
- Experience of end-to-end Level 2/3/4 management and a good overview of Production/Operations Management overall
- Experience of supporting complex application and infrastructure domains
- ITIL / best practice service context; ITIL Foundation is a plus
- Good analytical and problem-solving skills

Added advantage if you know the following technologies:
- ETL flows and pipelines
- Knowledge of big data, Spark, Hive, etc.
- Hands-on experience with Splunk/New Relic for creating dashboards along with alert/rule setups
- Understanding of messaging systems like SWIFT and MQ messages
- Understanding of trade life cycles, especially for back office

How We’ll Support You
- Training and development to help you excel in your career
- Coaching and support from experts in your team
- A culture of continuous learning to aid progression
- A range of flexible benefits that you can tailor to suit your needs

About Us And Our Teams
Please visit our company website for further information: https://www.db.com/company/company.htm
We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.

Posted 1 day ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Position Overview
Job Title: Associate - Production Support Engineer
Location: Bangalore, India

Role Description
You will be operating within Corporate Bank Production as an Associate Production Support Engineer in the Corporate Banking subdivisions. You will be accountable for driving a culture of proactive continual improvement into the production environment through application and user request support, troubleshooting and resolving errors in the production environment, automation of manual work, monitoring improvements, and platform hygiene; you will also support the resolution of issues and conflicts and prepare reports and meetings. The candidate should have experience with all relevant tools used in the Service Operations environment and specialist expertise in one or more technical domains, and should ensure that all associated Service Operations stakeholders are provided with an optimum level of service in line with Service Level Agreements (SLAs) / Operating Level Agreements (OLAs). Ensure all BAU support queries from the business are handled on priority and within the agreed SLA, and ensure all application stability issues are well taken care of. Support the resolution of incidents and problems within the team. Assist with the resolution of complex incidents. Ensure that the right problem-solving techniques and processes are applied. Embrace a Continuous Service Improvement approach to resolve IT failings, drive efficiencies and remove repetition to streamline support activities, reduce risk, and improve system availability. Be responsible for your own engineering delivery and, using data and analytics, drive a reduction in technical debt across the production environment with development and infrastructure teams. Act as a Production Engineering role model to enhance the technical capability of the Production Support teams to create a future operating model embedded with engineering culture.

Deutsche Bank’s Corporate Bank division is a leading provider of cash management, trade finance and securities finance. We complete green-field projects that deliver the best Corporate Bank - Securities Services products in the world. Our team is diverse, international, and driven by a shared focus on clean code and valued delivery. At every level, agile minds are rewarded with competitive pay, support, and opportunities to excel. You will work as part of a cross-functional agile delivery team. You will bring an innovative approach to software development, focusing on using the latest technologies and practices, as part of a relentless focus on business value. You will be someone who sees engineering as a team activity, with a predisposition to open code, open discussion and creating a supportive, collaborative environment. You will be ready to contribute to all stages of software delivery, from initial analysis right through to production support.

What We’ll Offer You
As part of our flexible scheme, here are just some of the benefits that you’ll enjoy:
- Best in class leave policy
- Gender neutral parental leaves
- 100% reimbursement under childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Employee Assistance Program for you and your family members
- Comprehensive hospitalization insurance for you and your dependents
- Accident and term life insurance
- Complimentary health screening for 35 yrs. and above

Your Key Responsibilities
- Lead by example to drive a culture of proactive continual improvement into the production environment through automation of manual work, monitoring improvements and platform hygiene.
- Carry out technical analysis of the production platform to identify and remediate performance and resiliency issues.
- Engage in the Software Development Lifecycle (SDLC) to enhance production standards and controls.
- Update the RUN Book and KEDB as and when required.
- Participate in all BCP and component failure tests based on the run books.
- Understand the flow of data through the application infrastructure; it is critical to understand the dataflow to best provide operational support.
- Perform event monitoring and management via a 24x7 workbench that is both monitoring and regularly probing the service environment and acting on instruction of the run book.
- Drive knowledge management across the supported applications and ensure full compliance.
- Work with team members to identify areas of focus where training may improve team performance and incident resolution.

Your Skills And Experience
Recent experience of applying technical solutions to improve the stability of production environments, and working experience of some of the following technology skills:
- Technologies/Frameworks: Unix, Shell Scripting and/or Python; SQL stack; Oracle 12c/19c for PL/SQL, with familiarity with OEM tooling to review AWR reports and parameters; ITIL v3 certified (must); Control-M, CRON scheduling; MQ (DBUS, IBM); Java 8/OpenJDK 11 (at least) for debugging; familiarity with the Spring Boot framework; data streaming with Kafka (experience with the Confluent flavor a plus) and ZooKeeper; Hadoop framework
- Configuration Mgmt Tooling: Ansible
- Operating System/Platform: RHEL 7.x (preferred), RHEL 6.x; OpenShift (as we move towards cloud computing and the fact that Fabric is dependent on OpenShift)
- CI/CD: Jenkins (preferred)
- APM Tooling: one of Splunk, AppDynamics, Geneos, NewRelic
- Other platforms: Scheduling – Ctrl-M is a plus, Autosys, etc.; Search – Elastic Search and/or Solr+ is a plus
- Methodology: micro-services architecture; SDLC; Agile fundamentals; network topology (TCP, LAN, VPN, GSLB, GTM, etc.); familiarity with TDD and/or BDD; distributed systems; experience on cloud platforms such as Azure or GCP is a plus; familiarity with containerization/Kubernetes
- Tools: ServiceNow, Jira, Confluence, BitBucket and/or Git, IntelliJ, SQL Plus; familiarity with simple Unix tooling (putty, mPutty, Exceed); (PL/)SQL Developer
Also required: a good understanding of the ITIL Service Management framework, such as Incident, Problem, and Change processes; the ability to self-manage a book of work and ensure clear transparency on progress, with clear, timely communication of issues; excellent communication skills, both written and verbal, with attention to detail; the ability to work in a Follow the Sun model, in virtual teams and in a matrix structure; Service Operations experience within a global operations context; 6-9 years of experience in IT in large corporate environments, specifically in the area of controlled production environments or in Financial Services Technology in a client-facing function (Global Transaction Banking experience is a plus); experience of end-to-end Level 2/3/4 management and a good overview of Production/Operations Management overall; experience of run-book execution; experience of supporting complex application and infrastructure domains; good analytical, troubleshooting and problem-solving skills; and working knowledge of incident tracking tools (e.g., Remedy, Heat, etc.).

How We’ll Support You
- Training and development to help you excel in your career
- Coaching and support from experts in your team
- A culture of continuous learning to aid progression
- A range of flexible benefits that you can tailor to suit your needs

About Us And Our Teams
Please visit our company website for further information: https://www.db.com/company/company.htm
We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.

Posted 1 day ago

Apply

5.0 years

0 Lacs

India

Remote


KLDiscovery, a leading global provider of electronic discovery, information governance and data recovery services, is currently seeking a Senior Software Engineer, C++ for an exciting new opportunity. The position will assist in the review and analysis of applications, product development, and enhancements, including documentation, code development, and unit testing of releases, while adhering to KLDiscovery development processes and workflows with supervision and direction from lead developers and superiors. If you like working in a creative, technology-driven, high-energy, collaborative, casual environment, and you have strong software development abilities, this is the opportunity for you! Hybrid or remote, work-from-home opportunity.

Responsibilities
- Create, validate and review program code per specifications.
- Develop automated unit and API tests.
- Support bug fixes and implement enhancements to applications in production.
- Create, design and review software documentation.
- Utilize, communicate, and enforce coding standards.
- Provide technical support to applications in production within the defined SLA.
- Adhere to development processes and workflows.
- Assist and mentor the team, demonstrating technical excellence.
- Detect problems and areas that need improvement early, and raise issues.

Qualifications
- Fluent English (C1)
- At least 5 years of commercial, hands-on software development experience in C#/.NET and C++
- Experience with ASP.NET Core Blazor
- Experience with Entity Framework Core
- Experience with desktop applications (WinForms preferred)
- Experience with background jobs and workers (e.g. Hangfire)
- Experience with Angular is a plus
- Creating dataflow/sequence/C4 diagrams
- Good understanding of at least one of the architectural/design patterns: MVC/MVP/MVVM/Clean/Screaming/Hexagonal architectures
- .NET memory model and performance optimization solutions
- Writing functional tests
- Writing structure tests
- Understanding modularity and vertical slices
- Data privacy and securing desktop apps
- Ability to design functionalities based on requirements

Our Cultural Values
Entrepreneurs at heart, we are a customer-first team sharing one goal and one vision. We seek team members who are:
- Humble - No one is above another; we all work together to meet our clients' needs and we acknowledge our own weaknesses
- Hungry - We all are driven internally to be successful and to continually expand our contribution and impact
- Smart - We use emotional intelligence when working with one another and with clients
Our culture shapes our actions, our products, and the relationships we forge with our customers.

Who We Are
KLDiscovery provides technology-enabled services and software to help law firms, corporations, government agencies and consumers solve complex data challenges. The company, with offices in 26 locations across 17 countries, is a global leader in delivering best-in-class eDiscovery, information governance and data recovery solutions to support the litigation, regulatory compliance, internal investigation and data recovery and management needs of our clients. Serving clients for over 30 years, KLDiscovery offers data collection and forensic investigation, early case assessment, electronic discovery and data processing, application software and data hosting for web-based document reviews, and managed document review services. In addition, through its global Ontrack Data Recovery business, KLDiscovery delivers world-class data recovery, email extraction and restoration, data destruction and tape management.

KLDiscovery has been recognized as one of the fastest growing companies in North America by both Inc. Magazine (Inc. 5000) and Deloitte (Deloitte's Technology Fast 500). Additionally, KLDiscovery is an Orange-level Relativity Best in Service Partner, a Relativity Premium Hosting Partner, and maintains ISO/IEC 27001 certified data centers. KLDiscovery is an Equal Opportunity Employer.

Posted 1 day ago

Apply

5.0 years

0 Lacs

Thiruvananthapuram, Kerala, India

On-site


Join SADA as a Senior Data Engineer (ESS)!

Your Mission
As a Sr Data Engineer at SADA, you will ensure our customers' support issues are handled effectively. You will work with highly skilled support engineers focused on providing Google Cloud Platform data engineering solutions, including BigQuery, Cloud SQL, Google Cloud Monitoring, and related Google services. The Data Engineer is responsible for providing technical assistance and guidance to team members and customers, updating knowledge articles, and enacting improvements to our ServiceNow incident management system, as well as being a SADA ambassador to our clients. Participating in on-call rotations, the Data Engineer must also be technically adept with Google products and be able to seamlessly and effectively partner with other SADA work groups, our partners, and our customers. SADA ESS delivers 24x7 support from a variety of locations around the world. This is primarily a customer-facing role. You will also work closely with SADA's Customer Experience team to execute on their recommendations to our customers.

Pathway to Success
Our motivation is to provide customers with an exceptional experience in migrating, developing, modernizing, and operationalizing their systems in Google Cloud Platform. Your success starts by positively impacting the direction of a fast-growing practice with vision and passion. You will be measured bi-yearly by the breadth, magnitude, and quality of your contributions, your ability to estimate accurately, customer feedback at the close of projects, how well you collaborate with your peers, and the consultative polish you bring to customer interactions. As you continue to execute successfully, we will build a customized development plan together that leads you through the engineering or management growth tracks.

Expectations
- Required Travel: 10% travel to customer sites, conferences, and other related events.
- Customer Facing: You will interact with customers on a regular basis, sometimes daily, other times weekly/bi-weekly.
- Onboarding/Training: Ongoing, with first-week orientation followed by a 90-day onboarding schedule. Details of the timeline can be shared.

Job Requirements
Required Credentials:
- Google Professional Data Engineer certified, or able to complete the certification within the first 45 days of employment
- A secondary Google Cloud certification in any other specialization

Required Qualifications:
- 5+ years of experience writing software in at least two or more languages such as Python, Java, Scala, or Go
- Experience in supporting customers, preferably in 24/7 environments
- Experience in building production-grade data solutions (relational and NoSQL)
- Experience with systems monitoring/alerting, capacity planning, and performance tuning
- Experience with BI tools like Tableau, Looker, etc. will be an advantage
- Consultative mindset that delights the customer by building good rapport with them to fully understand their requirements and provide accurate solutions
- Exposure to at least one of the following:
  - Google Cloud Dataflow: building batch/streaming ETL pipelines with frameworks such as Apache Beam or Google Cloud Dataflow, working with messaging systems like Pub/Sub, Kafka and RabbitMQ, auto-scaling Dataflow clusters, and troubleshooting cluster operation issues
  - Data integration tools such as Fivetran, Striim, Data Fusion, etc.; must have hands-on experience configuring and integrating with multiple data sources within and outside of Google Cloud
  - Large enterprise migration support: migrating entire cloud or on-prem assets to Google Cloud, including data lakes, data warehouses, databases, business intelligence, jobs, etc., and providing consultations for optimizing cost, defining methodology, and coming up with a plan to execute the migration

Useful Qualifications:
- Experience working with Google Cloud data products (Cloud SQL, Spanner, Cloud Storage, Pub/Sub, Dataflow, Dataproc, Bigtable, BigQuery, Dataprep, Composer, etc.)
- Experience with IoT architectures and building real-time data streaming pipelines
- Experience operationalizing language models and machine learning models on large datasets
- Demonstrated leadership and self-direction; willingness to teach others and learn new techniques
- Demonstrated skills in selecting the right statistical tools given a data analysis problem
- Understanding of Chaos Engineering
- Understanding of PCI, SOC 2, and HIPAA compliance standards
- Understanding of the principle of least privilege and security best practices
- Understanding of cryptocurrency and blockchain technology

About SADA, An Insight company
Values: We built our core values on themes that internally compel us to deliver our best to our partners, our customers and to each other. Ensuring a diverse and inclusive workplace where we learn from each other is core to SADA's values. We welcome people of different backgrounds, experiences, abilities, and perspectives. We are an equal opportunity employer. Hunger. Heart. Harmony.

Work with the best: SADA has been the largest Google Cloud partner in North America since 2016 and, for the eighth year in a row, has been named a Google Global Partner of the Year.

Business Performance: SADA has been named to the INC 5000 Fastest-Growing Private Companies list for 15 years in a row, garnering Honoree status. CRN has also named SADA on the Top 500 Global Solutions Providers list for the past 5 years. The overall culture continues to evolve with engineering at its core: 3200+ projects completed, 4000+ customers served, 10K+ workloads and 30M+ users migrated to the cloud.

SADA India is committed to the safety of its employees and recommends that new hires receive a COVID vaccination before beginning work.
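For context on the Dataflow/Apache Beam exposure this posting describes, here is a minimal, illustrative streaming pipeline sketch using the Apache Beam Python SDK. The topic and table names are hypothetical placeholders; on Google Cloud such a pipeline would typically be submitted with the DataflowRunner.

```python
# Minimal sketch of a streaming Beam pipeline: read JSON events from Pub/Sub,
# parse them, and append them to a BigQuery table. All names are hypothetical.
import json
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(streaming=True)

with beam.Pipeline(options=options) as p:
    (
        p
        | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
              topic="projects/example-project/topics/orders")      # hypothetical topic
        | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
        | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
              "example-project:analytics.orders_stream",           # hypothetical table
              write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
              create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
          )
    )
```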

Posted 1 day ago

Apply

3.0 years

6 - 27 Lacs

Delhi, India

On-site

Linkedin logo

About The Opportunity A fast-growing player in the Data & Analytics consulting sector, we build cloud-native data platforms and real-time reporting solutions for enterprises in retail, BFSI, and healthcare. Leveraging Google Cloud’s advanced analytics stack, we turn high-volume data into actionable insights that accelerate digital transformation and revenue growth. Role & Responsibilities Design, develop, and optimize BigQuery data warehouses for petabyte-scale analytics. Build ingestion pipelines using Dataflow, Pub/Sub, and Cloud Storage to ensure reliable, low-latency data availability. Implement ELT/ETL workflows in Python and SQL, applying best practices for partitioning, clustering, and cost control. Create and orchestrate DAGs in Cloud Composer/Airflow to automate data processing and quality checks. Collaborate with analysts and business stakeholders to model datasets, define SLAs, and deliver high-impact dashboards. Harden production environments with CI/CD, Terraform, monitoring, and automated testing for zero-defect releases. Skills & Qualifications Must-Have 3+ years building data pipelines on Google Cloud Platform. Expert hands-on experience with BigQuery optimisation and SQL performance tuning. Proficiency in Python scripting for data engineering tasks. Deep understanding of Dataflow or Apache Beam streaming and batch paradigms. Solid grasp of data warehousing principles, partitioning, and metadata management. Version control, containerisation, and CI/CD exposure (Git, Docker, Cloud Build). Preferred Terraform/IaC for infrastructure provisioning. Experience migrating on-prem warehouses (Teradata, Netezza) to BigQuery. Knowledge of data governance, DLP, and security best practices in GCP. Benefits & Culture Highlights Industry-leading GCP certifications paid and supported. Product-grade engineering culture with peer mentorship and hackathons. On-site, collaboration-rich workplace designed for learning and innovation. Skills: apache beam,ci/cd,sql,python,git,cloud storage,dataflow,docker,bigquery,airflow,cloud build,terraform,data warehousing,gcp data engineer (bigquery)
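
The partitioning, clustering, and cost-control practices called out above can be illustrated with a short sketch using the google-cloud-bigquery Python client; the project, dataset, and column names are assumptions for the example, not details from the posting.

```python
# Sketch: create a day-partitioned, clustered BigQuery table and run a
# cost-capped query against it. Names are illustrative placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

schema = [
    bigquery.SchemaField("order_id", "STRING"),
    bigquery.SchemaField("customer_id", "STRING"),
    bigquery.SchemaField("order_ts", "TIMESTAMP"),
    bigquery.SchemaField("amount", "NUMERIC"),
]

table = bigquery.Table("example-project.sales.orders", schema=schema)
# Partition by day on the timestamp column and cluster by customer_id so that
# date-bounded, per-customer queries scan (and bill for) less data.
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY, field="order_ts"
)
table.clustering_fields = ["customer_id"]
client.create_table(table, exists_ok=True)

# Cost control: cap the bytes a query may scan; BigQuery rejects it otherwise.
job_config = bigquery.QueryJobConfig(maximum_bytes_billed=10 * 1024**3)
query = """
    SELECT customer_id, SUM(amount) AS total
    FROM `example-project.sales.orders`
    WHERE order_ts >= TIMESTAMP('2024-01-01')
    GROUP BY customer_id
"""
rows = client.query(query, job_config=job_config).result()
```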

Posted 1 day ago

Apply

3.0 years

6 - 27 Lacs

Chennai, Tamil Nadu, India

On-site

Linkedin logo

About The Opportunity A fast-growing player in the Data & Analytics consulting sector, we build cloud-native data platforms and real-time reporting solutions for enterprises in retail, BFSI, and healthcare. Leveraging Google Cloud’s advanced analytics stack, we turn high-volume data into actionable insights that accelerate digital transformation and revenue growth. Role & Responsibilities Design, develop, and optimize BigQuery data warehouses for petabyte-scale analytics. Build ingestion pipelines using Dataflow, Pub/Sub, and Cloud Storage to ensure reliable, low-latency data availability. Implement ELT/ETL workflows in Python and SQL, applying best practices for partitioning, clustering, and cost control. Create and orchestrate DAGs in Cloud Composer/Airflow to automate data processing and quality checks. Collaborate with analysts and business stakeholders to model datasets, define SLAs, and deliver high-impact dashboards. Harden production environments with CI/CD, Terraform, monitoring, and automated testing for zero-defect releases. Skills & Qualifications Must-Have 3+ years building data pipelines on Google Cloud Platform. Expert hands-on experience with BigQuery optimisation and SQL performance tuning. Proficiency in Python scripting for data engineering tasks. Deep understanding of Dataflow or Apache Beam streaming and batch paradigms. Solid grasp of data warehousing principles, partitioning, and metadata management. Version control, containerisation, and CI/CD exposure (Git, Docker, Cloud Build). Preferred Terraform/IaC for infrastructure provisioning. Experience migrating on-prem warehouses (Teradata, Netezza) to BigQuery. Knowledge of data governance, DLP, and security best practices in GCP. Benefits & Culture Highlights Industry-leading GCP certifications paid and supported. Product-grade engineering culture with peer mentorship and hackathons. On-site, collaboration-rich workplace designed for learning and innovation. Skills: apache beam,ci/cd,sql,python,git,cloud storage,dataflow,docker,bigquery,airflow,cloud build,terraform,data warehousing,gcp data engineer (bigquery)
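
The Cloud Composer/Airflow orchestration described above typically looks like the following minimal DAG sketch; the DAG id, schedule, and task callables are placeholders rather than any employer's actual pipeline.

```python
# Hedged sketch of an Airflow 2.x DAG: extract -> load -> quality checks.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_to_gcs(**context):
    # Pull data from a source system and land it in Cloud Storage (stubbed).
    print("extracted batch for", context["ds"])


def load_to_bigquery(**context):
    # Load the landed files into a staging table (stubbed).
    print("loaded batch for", context["ds"])


def run_quality_checks(**context):
    # Fail the run if row counts or null rates breach thresholds (stubbed).
    print("quality checks passed for", context["ds"])


with DAG(
    dag_id="daily_orders_elt",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_to_gcs", python_callable=extract_to_gcs)
    load = PythonOperator(task_id="load_to_bigquery", python_callable=load_to_bigquery)
    checks = PythonOperator(task_id="run_quality_checks", python_callable=run_quality_checks)

    extract >> load >> checks
```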

Posted 1 day ago

Apply

3.0 years

6 - 27 Lacs

Hyderabad, Telangana, India

On-site

Linkedin logo

About The Opportunity A fast-growing player in the Data & Analytics consulting sector, we build cloud-native data platforms and real-time reporting solutions for enterprises in retail, BFSI, and healthcare. Leveraging Google Cloud’s advanced analytics stack, we turn high-volume data into actionable insights that accelerate digital transformation and revenue growth. Role & Responsibilities Design, develop, and optimize BigQuery data warehouses for petabyte-scale analytics. Build ingestion pipelines using Dataflow, Pub/Sub, and Cloud Storage to ensure reliable, low-latency data availability. Implement ELT/ETL workflows in Python and SQL, applying best practices for partitioning, clustering, and cost control. Create and orchestrate DAGs in Cloud Composer/Airflow to automate data processing and quality checks. Collaborate with analysts and business stakeholders to model datasets, define SLAs, and deliver high-impact dashboards. Harden production environments with CI/CD, Terraform, monitoring, and automated testing for zero-defect releases. Skills & Qualifications Must-Have 3+ years building data pipelines on Google Cloud Platform. Expert hands-on experience with BigQuery optimisation and SQL performance tuning. Proficiency in Python scripting for data engineering tasks. Deep understanding of Dataflow or Apache Beam streaming and batch paradigms. Solid grasp of data warehousing principles, partitioning, and metadata management. Version control, containerisation, and CI/CD exposure (Git, Docker, Cloud Build). Preferred Terraform/IaC for infrastructure provisioning. Experience migrating on-prem warehouses (Teradata, Netezza) to BigQuery. Knowledge of data governance, DLP, and security best practices in GCP. Benefits & Culture Highlights Industry-leading GCP certifications paid and supported. Product-grade engineering culture with peer mentorship and hackathons. On-site, collaboration-rich workplace designed for learning and innovation. Skills: apache beam,ci/cd,sql,python,git,cloud storage,dataflow,docker,bigquery,airflow,cloud build,terraform,data warehousing,gcp data engineer (bigquery)
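
The automated data quality checks mentioned above can be as simple as the following hedged sketch of a BigQuery freshness and null-rate gate; the table name, column names, and thresholds are illustrative assumptions.

```python
# Sketch of a simple data-quality gate that a pipeline or DAG task could call.
from google.cloud import bigquery

TABLE = "example-project.analytics.daily_orders"


def check_freshness_and_nulls(max_null_rate: float = 0.01) -> None:
    client = bigquery.Client()
    query = f"""
        SELECT
          COUNT(*) AS row_count,
          SAFE_DIVIDE(COUNTIF(customer_id IS NULL), COUNT(*)) AS null_rate,
          MAX(order_ts) AS latest_ts
        FROM `{TABLE}`
        WHERE DATE(order_ts) = CURRENT_DATE()
    """
    row = list(client.query(query).result())[0]
    if row.row_count == 0:
        raise ValueError("no rows loaded for today")
    if row.null_rate > max_null_rate:
        raise ValueError(f"null rate {row.null_rate:.2%} exceeds {max_null_rate:.2%}")
    print("quality gate passed; latest timestamp:", row.latest_ts)


if __name__ == "__main__":
    check_freshness_and_nulls()
```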

Posted 1 day ago

Apply

3.0 years

6 - 27 Lacs

Pune, Maharashtra, India

On-site

Linkedin logo

About The Opportunity A fast-growing player in the Data & Analytics consulting sector, we build cloud-native data platforms and real-time reporting solutions for enterprises in retail, BFSI, and healthcare. Leveraging Google Cloud’s advanced analytics stack, we turn high-volume data into actionable insights that accelerate digital transformation and revenue growth. Role & Responsibilities Design, develop, and optimize BigQuery data warehouses for petabyte-scale analytics. Build ingestion pipelines using Dataflow, Pub/Sub, and Cloud Storage to ensure reliable, low-latency data availability. Implement ELT/ETL workflows in Python and SQL, applying best practices for partitioning, clustering, and cost control. Create and orchestrate DAGs in Cloud Composer/Airflow to automate data processing and quality checks. Collaborate with analysts and business stakeholders to model datasets, define SLAs, and deliver high-impact dashboards. Harden production environments with CI/CD, Terraform, monitoring, and automated testing for zero-defect releases. Skills & Qualifications Must-Have 3+ years building data pipelines on Google Cloud Platform. Expert hands-on experience with BigQuery optimisation and SQL performance tuning. Proficiency in Python scripting for data engineering tasks. Deep understanding of Dataflow or Apache Beam streaming and batch paradigms. Solid grasp of data warehousing principles, partitioning, and metadata management. Version control, containerisation, and CI/CD exposure (Git, Docker, Cloud Build). Preferred Terraform/IaC for infrastructure provisioning. Experience migrating on-prem warehouses (Teradata, Netezza) to BigQuery. Knowledge of data governance, DLP, and security best practices in GCP. Benefits & Culture Highlights Industry-leading GCP certifications paid and supported. Product-grade engineering culture with peer mentorship and hackathons. On-site, collaboration-rich workplace designed for learning and innovation. Skills: apache beam,ci/cd,sql,python,git,cloud storage,dataflow,docker,bigquery,airflow,cloud build,terraform,data warehousing,gcp data engineer (bigquery)

Posted 1 day ago

Apply

1.0 years

4 - 7 Lacs

Delhi

On-site

GlassDoor logo

Job Title: Nursing/Health Care Assistant Location: Oman Employment Type: Full-Time (rotational shifts, weekend availability) Salary: 250 to 300 OMR per month Reports To: RNs / LPNs / Nurse Manager Job Summary We are seeking a compassionate and dedicated Nursing/Health Care Assistant to support our nursing and rehabilitation team in delivering exceptional patient care. Under the supervision of RNs/LPNs, you will assist with daily living activities, monitor vital signs, maintain hygiene and safety, support therapy sessions, manage feeding and incontinence, perform light housekeeping, and assist with admissions, transfers, and transportation. Key Responsibilities 1. Personal Care & Activities of Daily Living Assist patients with bathing, grooming, dressing, toileting, and incontinence care. Support mobility: transfers, ambulation, positioning, turning to prevent bedsores, and range-of-motion exercises. Provide tube feeding and feeding assistance when necessary. 2. Observation & Monitoring Measure and record vital signs (BP, pulse, temperature, respiration) and intake/output per shift. Observe and document changes in behaviour, mood, physical condition, or signs of distress/aggression, and report promptly. Assist in restraining patients as per rehabilitation protocols. 3. Therapeutic Support Aid physiotherapists and participate in group or individual therapy sessions. Escort patients in emergency and non-emergency situations within the facility or to outpatient (OPD) appointments and events. 4. Medical & Equipment Care Support light medical tasks under supervision (e.g., non‑sterile dressings, routine equipment/supply care). Perform inventory checks and ensure medical supplies/equipment are organized and functional. 5. Environment & Safety Ensure patient rooms are clean and hygienic: change linens, sanitize equipment, tidy rooms. Maintain infection control, follow health & safety protocols, and supervise patients to prevent falls or harm. 6. Admissions, Transfers & Documentation Assist with patient admissions, transfers, and discharges. Accurately record care activities, observations, vitals, feeding, and output in patient charts. 7. Emotional & Companionship Support Provide compassionate companionship, basic patient education, and emotional support. Qualifications & Skills ANM diploma (2‑year) or CNA/Healthcare Assistant certification. 1–3 years minimum healthcare or GNM/BSc or relevant qualification; 3+ years preferred. CPR/BLS certification advantageous. Valid Dataflow clearance (for international candidates). Strong interpersonal, communication, empathy, and confidentiality skills. Physically able to lift up to ~50 lbs, stand for long periods, and perform patient transfers. Working Hours & Benefits Schedule : Rotational shifts; weekend availability. Benefits : Free Joining Ticket (Will be reimbursed after the 3 months’ Probation period) 30 Days paid Annual leave after 1 year of service completion Yearly Up and Down Air Ticket Medical Insurance Life Insurance Accommodation (Chargeable up to OMR 20/-) Note: Interested candidates please call us at 97699 11050 or 99302 65888 , or email your CV to recruitment@thegrowthhive.org . Job Type: Full-time Pay: ₹40,000.00 - ₹60,000.00 per month Benefits: Food provided Health insurance Provident Fund Schedule: Monday to Friday Rotational shift Weekend availability Work Location: In person

Posted 1 day ago

Apply

15.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Linkedin logo

Position Overview
Job Title: Lead Engineer
Location: Pune, India

Role Description
The Lead Engineer is responsible for managing or performing work across multiple areas of the bank's overall IT platform/infrastructure, including analysis, development, and administration. It may also involve taking functional oversight of engineering delivery for specific departments. Work includes:
Planning and developing entire engineering solutions to accomplish business goals
Building reliability and resiliency into solutions with appropriate testing and reviewing throughout the delivery lifecycle
Ensuring maintainability and reusability of engineering solutions
Ensuring solutions are well architected and can be integrated successfully into the end-to-end business process flow
Reviewing engineering plans and quality to drive re-use and improve engineering capability
Participating in industry forums to drive adoption of innovative technologies, tools and solutions in the Bank

What we'll offer you:
As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
Best in class leave policy
Gender neutral parental leaves
100% reimbursement under childcare assistance benefit (gender neutral)
Sponsorship for industry-relevant certifications and education
Employee Assistance Program for you and your family members
Comprehensive hospitalization insurance for you and your dependents
Accident and term life insurance
Complimentary health screening for those 35 yrs. and above

Your Key Responsibilities:
The candidate is expected to:
Act as a hands-on engineering lead involved in analysis, design, design/code reviews, coding and release activities
Champion engineering best practices and guide/mentor the team to achieve high performance
Work closely with business stakeholders, the Tribe Lead, Product Owner, and Lead Architect to successfully deliver the business outcomes
Acquire functional knowledge of the business capability being digitized/re-engineered
Demonstrate ownership, inspire others, apply innovative thinking and a growth mindset, and collaborate for success

Your Skills & Experience:
Minimum 15 years of IT industry experience in full-stack development
Expert in Java, Spring Boot, NodeJS, SQL/PLSQL, ReactJS
Strong experience in big data processing – Apache Spark, Hadoop, BigQuery, Dataproc, Dataflow, etc.
Strong experience in Kubernetes and the OpenShift container platform
Experience with databases – Oracle, PostgreSQL, MongoDB, Redis/Hazelcast; should understand data modeling, normalization, and performance optimization
Experience in message queues (RabbitMQ/IBM MQ, JMS) and data streaming, i.e. Kafka, Pub/Sub, etc.
Experience working on public cloud – GCP preferred, AWS or Azure
Knowledge of various distributed/multi-tiered architecture styles – microservices, data mesh, integration patterns, etc.
Experience with modern software product delivery practices, processes and tooling, and BizDevOps skills such as CI/CD pipelines using Jenkins, GitHub Actions, etc.
Experience designing solutions based on DDD and implementing Clean/Hexagonal Architecture for efficient systems that can handle large-scale operation
Experience leading teams and mentoring developers
Focus on quality – experience with TDD, BDD, stress and contract tests
Proficient in working with APIs (Application Programming Interfaces) and familiar with data formats like JSON, XML, YAML, Parquet, etc.

Key Skills: Java, Spring Boot, NodeJS, SQL/PLSQL, ReactJS

Advantageous:
Prior experience in the Banking/Finance domain
Experience with hybrid cloud solutions, preferably using GCP
Experience with product development

How we'll support you:
Training and development to help you excel in your career
Coaching and support from experts in your team
A culture of continuous learning to aid progression
A range of flexible benefits that you can tailor to suit your needs

About Us And Our Teams
Please visit our company website for further information: https://www.db.com/company/company.htm
We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.

Posted 1 day ago

Apply

6.0 - 10.0 years

15 - 20 Lacs

Hyderabad

Work from Office

Naukri logo

Develop, optimize, and maintain scalable data pipelines using Python and PySpark. Design and implement data processing workflows leveraging GCP services such as BigQuery, Dataflow, Cloud Functions, and Cloud Storage.
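
A role like this typically involves jobs along the lines of the following PySpark sketch, assuming a Dataproc cluster with the spark-bigquery connector available; the bucket, paths, and table names are placeholders.

```python
# Sketch: read raw JSON from Cloud Storage, aggregate, write to BigQuery.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_daily_aggregation").getOrCreate()

# Raw landing zone in GCS (placeholder path).
orders = spark.read.json("gs://example-bucket/raw/orders/2024-01-01/*.json")

daily_totals = (
    orders
    .withColumn("order_date", F.to_date("order_ts"))
    .groupBy("order_date", "customer_id")
    .agg(F.sum("amount").alias("total_amount"), F.count("*").alias("order_count"))
)

# Requires the spark-bigquery connector on the cluster; temporaryGcsBucket is
# used for the intermediate load files.
(
    daily_totals.write.format("bigquery")
    .option("table", "example-project.analytics.daily_customer_totals")
    .option("temporaryGcsBucket", "example-bucket-tmp")
    .mode("append")
    .save()
)
```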

Posted 2 days ago

Apply

2.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Linkedin logo

What you’ll do With moderate supervision, manage project's progress, metadata collection, development and management. Perform investigations on internal / external stakeholder queries with high level direction from the Team Leader Analyze problems, identify root cause, formulate findings and observations of results, suggest resolutions and communicate to internal / external stakeholders with moderate guidance from the Team Leader. Maintain current knowledge of industry regulatory requirements such as reporting mandates, concepts and procedures, compliance requirements, and regulatory framework and structure. Be able to support internal/external queries on data standards. Enter/maintain information in documentation repository. Follow established security protocols, identify and report potential vulnerabilities. Perform intermediate level data quality checks, following established procedures. What Experience You Need BS degree in a STEM major or equivalent discipline; Master’s Degree strongly preferred 2+ years of experience as a data engineer or related role Cloud certification strongly preferred Intermediate skills using programming languages - Python, SQL (Big Query) or scripting languages Basic understanding and experience with Google Cloud Platforms and an overall understanding of cloud computing concepts Experience building and maintaining simple data pipelines, following guidelines, transforming and entering data into a data pipeline in order for the content to be digested and usable for future projects Experience supporting the design and implementation of basic data models Demonstrates proficient Git usage and contributes to team repositories What could set you apart Master's Degree Experience with GCP (Cloud certification strongly preferred) Experience with big data tools: Hadoop, Spark, Kafka, etc. Experience with relational SQL and NoSQL databases, including Postgres and Cassandra. Experience with data pipeline and workflow management tools: Airflow, GCP dataflow etc. Experience with AI or Machine Learning Experience with Data Visualisation Tools such as Tableau or Looker Experience with object-oriented/object function scripting languages: Python, Java, C++, Scala, etc.

Posted 2 days ago

Apply

5.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

Linkedin logo

Equifax is where you can power your possible. If you want to achieve your true potential, chart new paths, develop new skills, collaborate with bright minds, and make a meaningful impact, we want to hear from you. Equifax is seeking creative, high-energy and driven software engineers with hands-on development skills to work on a variety of meaningful projects. Our software engineering positions provide you the opportunity to join a team of talented engineers working with leading-edge technology. You are ideal for this position if you are a forward-thinking, committed, and enthusiastic software engineer who is passionate about technology. What You’ll Do Design, develop, and operate high scale applications across the full engineering stack Design, develop, test, deploy, maintain, and improve software. Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.) Work across teams to integrate our systems with existing internal systems, Data Fabric, CSA Toolset. Participate in technology roadmap and architecture discussions to turn business requirements and vision into reality. Participate in a tight-knit, globally distributed engineering team. Triage product or system issues and debug/track/resolve by analyzing the sources of issues and the impact on network, or service operations and quality. Manage sole project priorities, deadlines, and deliverables. Research, create, and develop software applications to extend and improve on Equifax Solutions Collaborate on scalability issues involving access to data and information. Actively participate in Sprint planning, Sprint Retrospectives, and other team activity What Experience You Need Bachelor's degree or equivalent experience 5+ years of software engineering experience 5+ years experience writing, debugging, and troubleshooting code in mainstream Java, SpringBoot, TypeScript/JavaScript, HTML, CSS 5+ years experience with Cloud technology: GCP, AWS, or Azure 5+ years experience designing and developing cloud-native solutions 5+ years experience designing and developing microservices using Java, SpringBoot, GCP SDKs, GKE/Kubernetes 5+ years experience deploying and releasing software using Jenkins CI/CD pipelines, understand infrastructure-as-code concepts, Helm Charts, and Terraform constructs What could set you apart Self-starter that identifies/responds to priority shifts with minimal supervision. Experience designing and developing big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, PubSub, GCS, Composer/Airflow, and others UI development (e.g. HTML, JavaScript, Angular and Bootstrap) Experience with backend technologies such as JAVA/J2EE, SpringBoot, SOA and Microservices Source code control management systems (e.g. SVN/Git, Github) and build tools . Agile environments (e.g. Scrum, XP) Relational databases Atlassian tooling (e.g. JIRA, Confluence, and Github) Developing with modern JDK (v1.7+) We offer a hybrid work setting, comprehensive compensation and healthcare packages, attractive paid time off, and organizational growth potential through our online learning platform with guided career tracks. Are you ready to power your possible? Apply today, and get started on a path toward an exciting new career at Equifax, where you can make a difference! Who is Equifax? At Equifax, we believe knowledge drives progress. 
As a global data, analytics and technology company, we play an essential role in the global economy by helping employers, employees, financial institutions and government agencies make critical decisions with greater confidence. We work to help create seamless and positive experiences during life’s pivotal moments: applying for jobs or a mortgage, financing an education or buying a car. Our impact is real and to accomplish our goals we focus on nurturing our people for career advancement and their learning and development, supporting our next generation of leaders, maintaining an inclusive and diverse work environment, and regularly engaging and recognizing our employees. Regardless of location or role, the individual and collective work of our employees makes a difference and we are looking for talented team players to join us as we help people live their financial best. Equifax is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.

Posted 2 days ago

Apply

2.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Linkedin logo

Equifax is where you can power your possible. If you want to achieve your true potential, chart new paths, develop new skills, collaborate with bright minds, and make a meaningful impact, we want to hear from you. What you’ll do With moderate supervision, manage project's progress, metadata collection, development and management. Perform investigations on internal / external stakeholder queries with high level direction from the Team Leader Analyze problems, identify root cause, formulate findings and observations of results, suggest resolutions and communicate to internal / external stakeholders with moderate guidance from the Team Leader. Maintain current knowledge of industry regulatory requirements such as reporting mandates, concepts and procedures, compliance requirements, and regulatory framework and structure. Be able to support internal/external queries on data standards. Enter/maintain information in documentation repository. Follow established security protocols, identify and report potential vulnerabilities. Perform intermediate level data quality checks, following established procedures. What Experience You Need BS degree in a STEM major or equivalent discipline; Master’s Degree strongly preferred 2+ years of experience as a data engineer or related role Cloud certification strongly preferred Intermediate skills using programming languages - Python, SQL (Big Query) or scripting languages Basic understanding and experience with Google Cloud Platforms and an overall understanding of cloud computing concepts Experience building and maintaining simple data pipelines, following guidelines, transforming and entering data into a data pipeline in order for the content to be digested and usable for future projects Experience supporting the design and implementation of basic data models Demonstrates proficient Git usage and contributes to team repositories What could set you apart Master's Degree Experience with GCP (Cloud certification strongly preferred) Experience with big data tools: Hadoop, Spark, Kafka, etc. Experience with relational SQL and NoSQL databases, including Postgres and Cassandra. Experience with data pipeline and workflow management tools: Airflow, GCP dataflow etc. Experience with AI or Machine Learning Experience with Data Visualisation Tools such as Tableau or Looker Experience with object-oriented/object function scripting languages: Python, Java, C++, Scala, etc. We offer a hybrid work setting, comprehensive compensation and healthcare packages, attractive paid time off, and organizational growth potential through our online learning platform with guided career tracks. Are you ready to power your possible? Apply today, and get started on a path toward an exciting new career at Equifax, where you can make a difference! Who is Equifax? At Equifax, we believe knowledge drives progress. As a global data, analytics and technology company, we play an essential role in the global economy by helping employers, employees, financial institutions and government agencies make critical decisions with greater confidence. We work to help create seamless and positive experiences during life’s pivotal moments: applying for jobs or a mortgage, financing an education or buying a car. Our impact is real and to accomplish our goals we focus on nurturing our people for career advancement and their learning and development, supporting our next generation of leaders, maintaining an inclusive and diverse work environment, and regularly engaging and recognizing our employees. 
Regardless of location or role, the individual and collective work of our employees makes a difference and we are looking for talented team players to join us as we help people live their financial best. Equifax is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.

Posted 2 days ago

Apply

4.0 - 6.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Linkedin logo

Key Responsibilities: Design, develop, and maintain scalable data pipelines and ETL processes using GCP services such as BigQuery, Cloud Data Fusion, Dataflow, Pub/Sub, Cloud Storage, Composer, Cloud Functions, and Cloud Run. Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver high-quality data solutions. Implement data integration solutions to ingest, process, and store large volumes of structured and unstructured data from various sources. Optimize and tune data pipelines for performance, reliability, and cost-efficiency. Ensure data quality and integrity through data validation, cleansing, and transformation processes. Develop and maintain data models, schemas, and metadata to support data analytics and reporting. Monitor and troubleshoot data pipeline issues, ensuring timely resolution and minimal disruption to data workflows. Stay up-to-date with the latest GCP technologies and best practices, and provide recommendations for continuous improvement. Mentor and guide junior data engineers, fostering a culture of knowledge sharing and collaboration. Qualifications: Bachelor's or Master's degree in Computer Science, Information Technology, or a related field. 4 to 6 years of experience in data engineering, with a strong focus on GCP. Proficiency in GCP services such as BigQuery, Cloud Data Fusion, Dataflow, Pub/Sub, Cloud Storage, Composer, Cloud Functions, and Cloud Run. Strong programming skills in Python, PLSQL. Experience with SQL and NoSQL databases. Knowledge of data warehousing concepts and best practices. Familiarity with data integration tools and frameworks. Excellent problem-solving and analytical skills. Strong communication and collaboration skills. Ability to work in a fast-paced, dynamic environment.
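
The Pub/Sub, Cloud Functions, and BigQuery combination listed above often takes a shape like the following sketch of a Pub/Sub-triggered function (1st-gen background-function signature); the table name and payload layout are assumptions for illustration.

```python
# Sketch: Cloud Function triggered by Pub/Sub, streaming each event into BigQuery.
import base64
import json

from google.cloud import bigquery

client = bigquery.Client()
TABLE = "example-project.analytics.events"  # placeholder table


def ingest_event(event, context):
    """Entry point: decode the Pub/Sub message and stream it into BigQuery."""
    payload = json.loads(base64.b64decode(event["data"]).decode("utf-8"))
    errors = client.insert_rows_json(TABLE, [payload])
    if errors:
        # Raising lets the platform retry the event if retries are enabled.
        raise RuntimeError(f"BigQuery insert errors: {errors}")
```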

Posted 2 days ago

Apply

9.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Linkedin logo

About McDonald’s: One of the world’s largest employers with locations in more than 100 countries, McDonald’s Corporation has corporate opportunities in Hyderabad. Our global offices serve as dynamic innovation and operations hubs, designed to expand McDonald's global talent base and in-house expertise. Our new office in Hyderabad will bring together knowledge across business, technology, analytics, and AI, accelerating our ability to deliver impactful solutions for the business and our customers across the globe. Position Summary: We are seeking an experienced Data Architect to design, implement, and optimize scalable data solutions on Amazon Web Services (AWS) and / or Google Cloud Platform (GCP). The ideal candidate will lead the development of enterprise-grade data architectures that support analytics, machine learning, and business intelligence initiatives while ensuring security, performance, and cost optimization. Who we are looking for: Primary Responsibilities: Key Responsibilities Architecture & Design: Design and implement comprehensive data architectures using AWS or GCP services Develop data models, schemas, and integration patterns for structured and unstructured data Create solution blueprints, technical documentation, architectural diagrams, and best practice guidelines Implement data governance frameworks and ensure compliance with security standards Design disaster recovery and business continuity strategies for data systems Technical Leadership: Lead cross-functional teams in implementing data solutions and migrations Provide technical guidance on cloud data services selection and optimization Collaborate with stakeholders to translate business requirements into technical solutions Drive adoption of cloud-native data technologies and modern data practices Platform Implementation: Implement data pipelines using cloud-native services (AWS Glue, Google Dataflow, etc.) 
Configure and optimize data lakes and data warehouses (S3 / Redshift, GCS / BigQuery)
Set up real-time streaming data processing solutions (Kafka, Airflow, Pub/Sub)
Implement automated data quality monitoring and validation processes
Establish CI/CD pipelines for data infrastructure deployment
Performance & Optimization:
Monitor and optimize data pipeline performance and cost efficiency
Implement data partitioning, indexing, and compression strategies
Conduct capacity planning and scaling recommendations
Troubleshoot complex data processing issues and performance bottlenecks
Establish monitoring, alerting, and logging for data systems
Skills:
Bachelor's degree in Computer Science, Data Engineering, or related field
9+ years of experience in data architecture and engineering
5+ years of hands-on experience with AWS or GCP data services
Experience with large-scale data processing and analytics platforms
AWS: Redshift, S3, Glue, EMR, Kinesis, Lambda, AWS Data Pipeline, Step Functions, CloudFormation
GCP: BigQuery, Cloud Storage, Dataflow, Dataproc, Pub/Sub, Cloud Functions, Cloud Composer, Deployment Manager
IAM, VPC, and security configurations
SQL and NoSQL databases
Big data technologies (Spark, Hadoop, Kafka)
Programming languages (Python, Java, SQL)
Data modeling and ETL/ELT processes
Infrastructure as Code (Terraform, CloudFormation)
Container technologies (Docker, Kubernetes)
Data warehousing concepts and dimensional modeling
Experience with modern data architecture patterns
Real-time and batch data processing architectures
Data governance, lineage, and quality frameworks
Business intelligence and visualization tools
Machine learning pipeline integration
Strong communication and presentation abilities
Leadership and team collaboration skills
Problem-solving and analytical thinking
Customer-focused mindset with business acumen
Preferred Qualifications:
Master's degree in relevant field
Cloud certifications (AWS Solutions Architect, GCP Professional Data Engineer)
Experience with multiple cloud platforms
Knowledge of data privacy regulations (GDPR, CCPA)
Work location: Hyderabad, India
Work pattern: Full time role.
Work mode: Hybrid.
Additional Information: McDonald's is committed to providing qualified individuals with disabilities with reasonable accommodations to perform the essential functions of their jobs. McDonald's provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to sex, sex stereotyping, pregnancy (including pregnancy, childbirth, and medical conditions related to pregnancy, childbirth, or breastfeeding), race, color, religion, ancestry or national origin, age, disability status, medical condition, marital status, sexual orientation, gender, gender identity, gender expression, transgender status, protected military or veteran status, citizenship status, genetic information, or any other characteristic protected by federal, state or local laws. This policy applies to all terms and conditions of employment, including recruiting, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation and training. McDonald's Capability Center India Private Limited ("McDonald's in India") is a proud equal opportunity employer and is committed to hiring a diverse workforce and sustaining an inclusive culture.
At McDonald’s in India, employment decisions are based on merit, job requirements, and business needs, and all qualified candidates are considered for employment. McDonald’s in India does not discriminate based on race, religion, colour, age, gender, marital status, nationality, ethnic origin, sexual orientation, political affiliation, veteran status, disability status, medical history, parental status, genetic information, or any other basis protected under state or local laws. Nothing in this job posting or description should be construed as an offer or guarantee of employment.

Posted 2 days ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Linkedin logo

Job Description The Security Platform Engineering team, EPEO is looking for a passionate, experienced DevOps Engineer who is excited to foray into any technology and create innovative products & services. The interested candidate should have experience in designing and implementing frontend technologies, constructing backend APIs, database, setting up infrastructure in cloud and automating repeatable tasks. Also, the engineer should be a team player working with the team of developers from conception to the final product stage. Responsibilities YOUR TYPICAL DAY HERE WOULD BE: Design/Develop APIs using Java or Python and deploy using GCP Services Design, build, and maintain robust and scalable data pipelines to ingest, process, and transform data from various sources using GCP Services Contribute to the design and architecture of our data infrastructure and Automate data pipeline deployment and management Create websites using Angular, CSS, Hugo, JavaScript/TypeScript Automating repeatable tasks, workflows to improve efficiency of processes. Design, build, observability dashboards using Dynatrace, Grafana, Looker etc. Qualifications WHAT YOUR SKILLSET LOOKS LIKE: A relevant Bachelor's or Master’s Degree in computer science / engineering 3+ Experience in developing RESTful endpoints (Python or Java), websites and deploying using GCP Services Proficiency in using GCP services, including Cloud Run, BigQuery, Dataflow, and Google Cloud Storage (GCS). Experience working in DevOps or Agile development team Deep understanding of SRE concepts, including monitoring, alerting, automation, and incident management WOULD BE GREAT IF YOU ALSO BRING: GCP Certification
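
The API-plus-GCP-deployment work described above can be pictured with the following minimal Python endpoint suitable for containerizing and deploying to Cloud Run; Flask and the example table are assumptions, not requirements from the posting.

```python
# Sketch: small REST API backed by BigQuery, listening on the port Cloud Run injects.
import os

from flask import Flask, jsonify
from google.cloud import bigquery

app = Flask(__name__)
bq = bigquery.Client()


@app.route("/healthz")
def healthz():
    return jsonify(status="ok")


@app.route("/metrics/daily-events")
def daily_events():
    query = """
        SELECT CAST(DATE(event_ts) AS STRING) AS day, COUNT(*) AS events
        FROM `example-project.security.audit_events`
        GROUP BY day
        ORDER BY day DESC
        LIMIT 30
    """
    rows = [dict(row) for row in bq.query(query).result()]
    return jsonify(rows)


if __name__ == "__main__":
    # Cloud Run injects PORT; default to 8080 for local runs.
    app.run(host="0.0.0.0", port=int(os.environ.get("PORT", 8080)))
```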

Posted 2 days ago

Apply

Exploring Dataflow Jobs in India

The dataflow job market in India is currently experiencing a surge in demand for skilled professionals. With the increasing reliance on data-driven decision-making in various industries, the need for individuals proficient in managing and analyzing dataflow is on the rise. This article aims to provide job seekers with valuable insights into the dataflow job landscape in India.

Top Hiring Locations in India

  1. Bangalore
  2. Mumbai
  3. Pune
  4. Hyderabad
  5. Delhi

These cities are known for their thriving tech ecosystems and are home to numerous companies actively hiring for dataflow roles.

Average Salary Range

The average salary for dataflow professionals in India varies with experience level. Entry-level professionals can expect to earn INR 4-6 lakhs per annum, while experienced professionals can command salaries upwards of INR 12-15 lakhs per annum.

Career Path

In the dataflow domain, a typical career path may involve starting as a Junior Data Analyst or Data Engineer, progressing to roles such as Senior Data Scientist or Data Architect, and eventually reaching positions like Tech Lead or Data Science Manager.

Related Skills

In addition to expertise in dataflow tools and technologies, dataflow professionals are often expected to have proficiency in programming languages such as Python or R, knowledge of databases like SQL, and familiarity with data visualization tools like Tableau or Power BI.

Interview Questions

  • What is dataflow and how is it different from data streaming? (basic)
  • Explain the difference between batch processing and real-time processing. (medium)
  • How do you handle missing or null values in a dataset? (basic; see the pandas sketch after this list)
  • Can you explain the concept of data lineage? (medium)
  • What is the importance of data quality in dataflow processes? (basic)
  • How do you optimize dataflow pipelines for performance? (medium)
  • Describe a time when you had to troubleshoot a dataflow issue. (medium)
  • What are some common challenges faced in dataflow projects? (medium)
  • How do you ensure data security and compliance in dataflow processes? (medium)
  • What are the key components of a dataflow architecture? (medium)
  • Explain the concept of data partitioning in dataflow. (advanced)
  • How would you handle a sudden increase in data volume in a dataflow pipeline? (advanced)
  • What role does data governance play in dataflow processes? (medium)
  • Can you discuss the advantages and disadvantages of using cloud-based dataflow solutions? (medium)
  • How do you stay updated with the latest trends and technologies in dataflow? (basic)
  • What is the significance of metadata in dataflow management? (medium)
  • Walk us through a dataflow project you have worked on from start to finish. (medium)
  • How do you ensure data quality and consistency across different data sources in a dataflow pipeline? (medium)
  • What are some best practices for monitoring and troubleshooting dataflow pipelines? (medium)
  • How do you handle data transformations and aggregations in a dataflow process? (basic)
  • What are the key performance indicators you would track in a dataflow project? (medium)
  • How do you collaborate with cross-functional teams in a dataflow project? (basic)
  • Can you explain the concept of data replication in dataflow management? (advanced)
  • How do you approach data modeling in a dataflow project? (medium)
  • Describe a challenging dataflow problem you encountered and how you resolved it. (advanced)
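
Several of the basic questions above lend themselves to a short worked example. The sketch below shows one common way to answer the missing/null-values question using pandas; the column names and fill strategies are illustrative only.

```python
# Sketch: a typical approach to handling missing values in a small DataFrame.
import pandas as pd

df = pd.DataFrame(
    {
        "customer_id": ["a1", "a2", None, "a4"],
        "amount": [120.0, None, 80.0, None],
        "channel": ["web", None, "store", "web"],
    }
)

# 1. Quantify the problem before choosing a strategy.
print(df.isna().mean())  # share of missing values per column

# 2. Drop rows where the business key is missing (they cannot be joined).
df = df.dropna(subset=["customer_id"])

# 3. Impute numeric gaps with a defensible statistic and flag the imputation.
df["amount_was_missing"] = df["amount"].isna()
df["amount"] = df["amount"].fillna(df["amount"].median())

# 4. Fill low-cardinality categoricals with an explicit sentinel.
df["channel"] = df["channel"].fillna("unknown")

print(df)
```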

Closing Remark

As you navigate the dataflow job market in India, remember to showcase your skills and experiences confidently during interviews. Stay updated with the latest trends in dataflow and continuously upskill to stand out in a competitive job market. Best of luck in your job search journey!


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies