
3139 Redshift Jobs - Page 23

JobPe aggregates results for easy access, but applications are submitted directly on the original job portal.

5.0 - 8.0 years

0 Lacs

India

On-site

Bloomreach is building the world's premier agentic platform for personalization. We're revolutionizing how businesses connect with their customers, building and deploying AI agents to personalize the entire customer journey. We're taking autonomous search mainstream, making product discovery more intuitive and conversational for customers, and more profitable for businesses. We're making conversational shopping a reality, connecting every shopper with tailored guidance and product expertise — available on demand, at every touchpoint in their journey. We're designing the future of autonomous marketing, taking the work out of workflows, and reclaiming the creative, strategic, and customer-first work marketers were always meant to do. And we're building all of that on the intelligence of a single AI engine — Loomi AI — so that personalization isn't only autonomous…it's also consistent. From retail to financial services, hospitality to gaming, businesses use Bloomreach to drive higher growth and lasting loyalty. We power personalization for more than 1,400 global brands, including American Eagle, Sonepar, and Pandora.

We are seeking a Senior AI Engineer to join our dynamic team. In this role, you will be instrumental in building data-driven ML/AI algorithms that enhance our search and recommendation systems. Your primary focus will be on data engineering, analysis, transformations, model training, and serving, ensuring practical and scalable applications of machine learning within our products. This position emphasizes productization and the implementation of ML/AI solutions over pure data science and research, making it ideal for professionals thriving in the fast-paced generative AI era.

Key Responsibilities

Data Engineering & Analysis: Slice and dice analytics data to formulate hypotheses and generate ideas to improve search and recommendation performance. Perform comprehensive data transformations to prepare datasets for model training and evaluation. Build and maintain data pipelines using tools like Airflow, Kubeflow, and MLflow to support ML/AI workflows.

Model Development & Deployment: Design, develop, and enhance machine learning and AI models tailored to product discovery and search functionalities. Conduct feature engineering to extract meaningful insights from historical data, search queries, product catalogs, and images. Collaborate with Data Engineers to integrate and scale ML components to production-level systems capable of handling large-scale data. Ensure seamless deployment of models, maintaining high availability and performance in cloud environments.

Algorithm Implementation & Optimization: Dive deep into algorithm applicability, performing impact analysis to ensure models meet performance and business objectives. Optimize and build new algorithms to address various challenges in product discovery and search.

Productization of ML/AI Solutions: Translate data-driven insights and models into actionable product features that enhance user experience. Work closely with Data Science, Product, and Engineering teams to implement practical ML/AI applications that drive business outcomes.

Continuous Learning & Improvement: Stay abreast of the latest advancements in ML/AI, particularly in generative AI and large language models (LLMs). Continuously refine and improve existing models and workflows based on new research and industry trends.
Qualifications

Educational Background: BS/MS degree in Computer Science, Engineering, Mathematics, or a related discipline with a strong mathematical foundation.

Experience: 5-8 years of experience building ML-driven, fast, and scalable ML/AI algorithms in a corporate or startup environment.

Technical Skills: Proficient in Python with excellent programming skills. Strong understanding of machine learning and natural language processing technologies, including classification, information retrieval, clustering, knowledge graphs, semi-supervised learning, and ranking. Experience with deep learning frameworks such as PyTorch, Keras, or TensorFlow. Proficient in SQL and experience with data warehouses like Redshift or BigQuery. Experience with big data technologies such as Hadoop, Spark, Kafka, and data lakes for large-scale processing. Strong understanding of data structures, algorithms, and system design for building highly available, high-performance systems. Experience with workflow orchestration and ML pipeline tools such as Airflow, Kubeflow, and MLflow.

Specialized Knowledge: Strong awareness of recent trends in Generative AI and Large Language Models (LLMs). Experience working with the GenAI stack is highly desirable.

Soft Skills: Excellent problem-solving and analytical skills with the ability to adapt to new ML technologies. Effective communication skills in English, both verbal and written. Ability to work collaboratively in a fast-paced, agile environment.

More things you'll like about Bloomreach:

Culture: A great deal of freedom and trust. At Bloomreach we don't clock in and out, and we have neither corporate rules nor long approval processes. This freedom goes hand in hand with responsibility. We are interested in results from day one. We have defined our 5 values and the 10 underlying key behaviors that we strongly believe in. We can only succeed if everyone lives these behaviors day to day. We've embedded them in our processes like recruitment, onboarding, feedback, personal development, performance review and internal communication. We believe in flexible working hours to accommodate your working style. We work virtual-first with several Bloomreach Hubs available across three continents. We organize company events to experience the global spirit of the company and get excited about what's ahead. We encourage and support our employees to engage in volunteering activities - every Bloomreacher can take 5 paid days off to volunteer*. The Bloomreach Glassdoor page elaborates on our stellar 4.4/5 rating. The Bloomreach Comparably page Culture score is even higher, at 4.9/5.

Personal Development: We have a People Development Program -- participating in personal development workshops on various topics run by experts from inside the company. We are continuously developing & updating competency maps for select functions. Our resident communication coach Ivo Večeřa is available to help navigate work-related communications & decision-making challenges.* Our managers are strongly encouraged to participate in the Leader Development Program to develop in the areas we consider essential for any leader. The program includes regular comprehensive feedback, consultations with a coach and follow-up check-ins.
Bloomreachers utilize the $1,500 professional education budget on an annual basis to purchase education products (books, courses, certifications, etc.)* Well-being: The Employee Assistance Program -- with counselors -- is available for non-work-related challenges.* Subscription to Calm - sleep and meditation app.* We organize ‘DisConnect’ days where Bloomreachers globally enjoy one additional day off each quarter, allowing us to unwind together and focus on activities away from the screen with our loved ones. We facilitate sports, yoga, and meditation opportunities for each other. Extended parental leave up to 26 calendar weeks for Primary Caregivers.* Compensation: Restricted Stock Units or Stock Options are granted depending on a team member’s role, seniority, and location.* Everyone gets to participate in the company's success through the company performance bonus.* We offer an employee referral bonus of up to $3,000 paid out immediately after the new hire starts. We reward & celebrate work anniversaries -- Bloomversaries!* (*Subject to employment type. Interns are exempt from marked benefits, usually for the first 6 months.) Excited? Join us and transform the future of commerce experiences! If this position doesn't suit you, but you know someone who might be a great fit, share it - we will be very grateful! Any unsolicited resumes/candidate profiles submitted through our website or to personal email accounts of employees of Bloomreach are considered property of Bloomreach and are not subject to payment of agency fees.
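For context on the ML workflow tooling this posting names (Airflow, Kubeflow, MLflow), here is a minimal, illustrative sketch of tracking a model-training run with MLflow; the dataset, model choice, and parameter names are assumptions for the example, not Bloomreach's actual pipeline.

```python
# Minimal sketch: tracking a training run with MLflow, one of the pipeline tools
# named in the posting. Dataset and hyperparameters are illustrative only.
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=5_000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

with mlflow.start_run(run_name="search-ranking-baseline"):
    params = {"n_estimators": 200, "learning_rate": 0.05, "max_depth": 3}
    mlflow.log_params(params)

    model = GradientBoostingClassifier(**params).fit(X_train, y_train)
    auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])

    mlflow.log_metric("test_auc", auc)        # evaluation metric for this run
    mlflow.sklearn.log_model(model, "model")  # versioned artifact for later serving
```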

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

India

On-site

We’re on the lookout for a skilled and motivated Data Engineer to join our growing tech team. If you’re passionate about building robust data pipelines, optimizing data workflows, and enabling smart data-driven decisions — we’d love to connect with you! Key Responsibilities: Design, build, and maintain scalable ETL/ELT pipelines Integrate data from multiple sources into centralized data stores Work closely with Data Analysts and Scientists to support analytical needs Optimize data delivery for performance and reliability Ensure data integrity, quality, and compliance Preferred Skills & Experience: 2–5 years of experience in Data Engineering Strong knowledge of SQL, Python, Spark/PySpark Experience with data warehousing (e.g., Snowflake, Redshift, BigQuery) Hands-on with ETL tools, data pipelines, and APIs Familiarity with cloud platforms (Azure, AWS, or GCP)
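As an illustration of the ETL/ELT pipeline work this posting describes, here is a minimal PySpark sketch; the source path, column names, and output location are assumed for the example rather than taken from the employer's systems.

```python
# Minimal PySpark sketch of an ETL step: read raw events, clean and aggregate,
# then write curated output for downstream loading into a warehouse
# (e.g. Snowflake, Redshift, or BigQuery). Paths and columns are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_daily_etl").getOrCreate()

orders = (
    spark.read.json("s3://example-raw-bucket/orders/2025-07-01/")  # assumed source
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .withColumn("order_date", F.to_date("order_ts"))
    .filter(F.col("status").isin("COMPLETED", "SHIPPED"))
)

daily_revenue = (
    orders.groupBy("order_date", "country")
    .agg(F.sum("amount").alias("revenue"),
         F.countDistinct("order_id").alias("orders"))
)

# Land curated Parquet files, partitioned by date, for the warehouse load step.
daily_revenue.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-curated-bucket/daily_revenue/"
)
```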

Posted 2 weeks ago

Apply

5.0 years

20 Lacs

Chandigarh

On-site

About the Role We are seeking a highly experienced and hands-on Fullstack Architect to lead the design and architecture of scalable, enterprise-grade software solutions. This role requires a deep understanding of both frontend and backend technologies, cloud infrastructure, and microservices, with the ability to guide teams through technical challenges and solution delivery. Key Responsibilities Architect, design, and oversee the development of full-stack applications using modern JS frameworks and cloud-native tools. Lead microservice architecture design, ensuring system scalability, reliability, and performance. Evaluate and implement AWS services (Lambda, ECS, Glue, Aurora, API Gateway, etc.) for backend solutions. Provide technical leadership to engineering teams across all layers (frontend, backend, database). Guide and review code, perform performance optimization, and define coding standards. Collaborate with DevOps and Data teams to integrate services (Redshift, OpenSearch, Batch). Translate business needs into technical solutions and communicate with cross-functional stakeholders. Required Skills Deep expertise in Node.js , TypeScript , React.js , Python , Redux , and Jest . Proven experience designing and deploying systems using Microservices architecture . Strong understanding of AWS services: API Gateway, ECS, Lambda, Aurora, Glue, SQS, OpenSearch, Batch. Hands-on with MySQL , Redshift , and writing optimized queries. Advanced knowledge of HTML, CSS, Bootstrap, JavaScript . Familiarity with tools: VS Code , DataGrip , Jira , GitHub , Postman . Strong knowledge of architectural design patterns and security best practices. Job Types: Full-time, Permanent Pay: From ₹2,055,277.41 per year Benefits: Health insurance Leave encashment Paid sick time Paid time off Provident Fund Schedule: Day shift Monday to Friday Education: Bachelor's (Preferred) Experience: Full-stack development: 5 years (Required) Location: Chandigarh, Chandigarh (Required) Shift availability: Day Shift (Required) Work Location: In person

Posted 2 weeks ago

Apply

7.0 - 12.0 years

5 - 7 Lacs

Hyderābād

Remote

Job Information: Date Opened: 07/08/2025 | Job Type: Full time | Industry: IT Services | City: Hyderabad | State/Province: Telangana | Country: India | Zip/Postal Code: 500059

About Us

About DATAECONOMY: We are a fast-growing data & analytics company headquartered in Dublin with offices in Dublin, OH, Providence, RI, and an advanced technology center in Hyderabad, India. We are clearly differentiated in the data & analytics space via our suite of solutions, accelerators, frameworks, and thought leadership.

Job Description

We are seeking a highly experienced and hands-on Lead/Senior Data Engineer to architect, develop, and optimize data solutions in a cloud-native environment. The ideal candidate will have 7–12 years of strong technical expertise in AWS Glue, PySpark, and Python, along with experience designing robust data pipelines and frameworks for large-scale enterprise systems. Prior exposure to the financial domain or regulated environments is a strong advantage.

Key Responsibilities:
Solution Architecture: Design scalable and secure data pipelines using AWS Glue, PySpark, and related AWS services (EMR, S3, Lambda, etc.).
Leadership & Mentorship: Guide junior engineers, conduct code reviews, and enforce best practices in development and deployment.
ETL Development: Lead the design and implementation of end-to-end ETL processes for structured and semi-structured data.
Framework Building: Develop and evolve data frameworks, reusable components, and automation tools to improve engineering productivity.
Performance Optimization: Optimize large-scale data workflows for performance, cost, and reliability.
Data Governance: Implement data quality, lineage, and governance strategies in compliance with enterprise standards.
Collaboration: Work closely with product, analytics, compliance, and DevOps teams to deliver high-quality solutions aligned with business goals.
CI/CD Automation: Set up and manage continuous integration and deployment pipelines using AWS CodePipeline, Jenkins, or GitLab.
Documentation & Presentations: Prepare technical documentation and present architectural solutions to stakeholders across levels.

Requirements

Required Qualifications:
7–12 years of experience in data engineering or related fields.
Strong expertise in Python programming with a focus on data processing.
Extensive experience with AWS Glue (both Glue Jobs and Glue Studio/Notebooks).
Deep hands-on experience with PySpark for distributed data processing.
Solid AWS knowledge: EMR, S3, Lambda, IAM, Athena, CloudWatch, Redshift, etc.
Proven experience in architecting and managing complex ETL workflows.
Proficiency with Apache Airflow or similar orchestration tools.
Hands-on experience with CI/CD pipelines and DevOps best practices.
Familiarity with data quality, data lineage, and metadata management.
Strong experience working in agile/scrum teams.
Excellent communication and stakeholder engagement skills.

Preferred/Good to Have:
Experience in financial services, capital markets, or compliance systems.
Knowledge of data modeling, data lakes, and data warehouse architecture.
Familiarity with SQL (Athena/Presto/Redshift Spectrum).
Exposure to ML pipeline integration or event-driven architecture is a plus.

Benefits
Flexible work culture and remote options
Opportunity to lead cutting-edge cloud data engineering projects
Skill-building in large-scale, regulated environments.
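A minimal sketch of an AWS Glue (PySpark) job of the kind this posting centers on; the catalog database, table name, and S3 paths are placeholders, not the employer's actual schema.

```python
# Sketch of an AWS Glue PySpark job skeleton: read a catalogued source table,
# clean it, and write curated Parquet for later loading into Redshift.
# Database, table, and bucket names below are illustrative assumptions.
import sys
from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext
from pyspark.sql import functions as F

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read a Glue Data Catalog table as a DynamicFrame, then switch to a Spark DataFrame.
dyf = glue_context.create_dynamic_frame.from_catalog(
    database="finance_raw", table_name="transactions"
)
df = dyf.toDF().withColumn("trade_date", F.to_date("trade_ts"))

cleaned = df.dropDuplicates(["transaction_id"]).filter(F.col("amount").isNotNull())

cleaned.write.mode("overwrite").parquet("s3://example-curated/transactions/")
job.commit()
```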

Posted 2 weeks ago

Apply

0 years

6 - 8 Lacs

Hyderābād

On-site

Ready to shape the future of work? At Genpact, we don't just adapt to change—we drive it. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's AI Gigafactory, our industry-first accelerator, is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, this is your moment. Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions – we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.

Inviting applications for the role of Consultant – AWS!

Responsibilities
Design and deploy scalable, highly available, and fault-tolerant AWS data processes using AWS data services (Glue, Lambda, Step Functions, Redshift).
Monitor and optimize the performance of cloud resources to ensure efficient utilization and cost-effectiveness.
Implement and maintain security measures to protect data and systems within the AWS environment, including IAM policies, security groups, and encryption mechanisms.
Migrate application data from legacy databases to cloud-based solutions (Redshift, DynamoDB, etc.) for high availability at low cost.
Develop application programs using big data technologies like Apache Hadoop and Apache Spark with appropriate cloud-based services such as AWS.
Build data pipelines by building ETL (Extract-Transform-Load) processes.
Implement backup, disaster recovery, and business continuity strategies for cloud-based applications and data.
Responsible for analysing business and functional requirements, which involves a review of existing system configurations and operating methodologies as well as understanding evolving business needs.
Analyse requirements/user stories in business meetings, strategize the impact of requirements on different platforms/applications, and convert business requirements into technical requirements.
Participate in design reviews to provide input on functional requirements, product designs, schedules and/or potential problems.
Understand the current application infrastructure and suggest cloud-based solutions which reduce operational cost and require minimal maintenance while providing high availability with improved security.
Perform unit testing on the modified software to ensure that new functionality works as expected while existing functionality continues to work in the same way.
Coordinate with release management and other supporting teams to deploy changes to the production environment.

Qualifications we seek in you!

Minimum Qualifications
Experience in designing and implementing data pipelines, building data applications, and data migration on AWS.
Strong experience implementing data lakes using AWS services like Glue, Lambda, Step Functions, and Redshift.
Experience with Databricks will be an added advantage.
Strong experience in Python and SQL.
Strong understanding of security principles and best practices for cloud-based environments.
Experience with monitoring tools and implementing proactive measures to ensure system availability and performance.
Excellent problem-solving skills and the ability to troubleshoot complex issues in a distributed, cloud-based environment.
Strong communication and collaboration skills to work effectively with cross-functional teams.

Preferred Qualifications/Skills
Master's Degree in Computer Science, Electronics, or Electrical.
AWS Data Engineering & Cloud certifications, Databricks certifications.
Experience working with Oracle ERP.
Experience with multiple data integration technologies and cloud platforms.
Knowledge of Change & Incident Management processes.

Why join Genpact?
Be a transformation leader – Work at the cutting edge of AI, automation, and digital innovation
Make an impact – Drive change for global enterprises and solve business challenges that matter
Accelerate your career – Get hands-on experience, mentorship, and continuous learning opportunities
Work with the best – Join 140,000+ bold thinkers and problem-solvers who push boundaries every day
Thrive in a values-driven culture – Our courage, curiosity, and incisiveness - built on a foundation of integrity and inclusion - allow your ideas to fuel progress

Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: Up. Let's build tomorrow together.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please do note that Genpact does not charge fees to process job applications and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.

Job: Consultant | Primary Location: India-Hyderabad | Schedule: Full-time | Education Level: Master's / Equivalent | Job Posting: Jul 7, 2025, 7:24:46 AM | Unposting Date: Ongoing | Master Skills List: Digital | Job Category: Full Time
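To illustrate the S3-to-Redshift data migration and loading mentioned in this posting, here is a hedged sketch using the Redshift Data API via boto3; the cluster, database, IAM role, and table names are assumptions for the example.

```python
# Minimal sketch: issue a COPY from S3 into Redshift through the Redshift Data API.
# All identifiers (cluster, database, role ARN, table, bucket) are placeholders.
import boto3

client = boto3.client("redshift-data", region_name="ap-south-1")

copy_sql = """
    COPY analytics.orders
    FROM 's3://example-curated-bucket/daily_revenue/'
    IAM_ROLE 'arn:aws:iam::123456789012:role/example-redshift-copy-role'
    FORMAT AS PARQUET;
"""

response = client.execute_statement(
    ClusterIdentifier="example-cluster",  # provisioned cluster; Serverless would use WorkgroupName
    Database="analytics",
    DbUser="etl_user",
    Sql=copy_sql,
)

# The Data API is asynchronous; poll describe_statement until the COPY completes.
status = client.describe_statement(Id=response["Id"])["Status"]
print(status)
```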

Posted 2 weeks ago

Apply

5.0 years

6 - 9 Lacs

Hyderābād

On-site

DevSecOps Engineer – CL4 Role Overview : As a DevSecOps Engineer , you will actively engage in your engineering craft, taking a hands-on approach to multiple high-visibility projects. Your expertise will be pivotal in delivering solutions that delight customers and users, while also driving tangible value for Deloitte's business investments. You will leverage your extensive DevSecOps engineering craftsmanship and advanced proficiency across multiple programming languages, DevSecOps tools, and modern frameworks, consistently demonstrating your strong track record in delivering high-quality, outcome-focused CI/CD and automation solutions. The ideal candidate will be a dependable team player, collaborating with cross-functional teams to design, develop, and deploy advanced software solutions. Key Responsibilities : Outcome-Driven Accountability: Embrace and drive a culture of accountability for customer and business outcomes. Develop DevSecOps engineering solutions that solve complex automation problems with valuable outcomes, ensuring high-quality, lean, resilient and secure pipelines with low operating costs, meeting platform/technology KPIs. Technical Leadership and Advocacy: Serve as the technical advocate for DevSecOps modern practices, ensuring integrity, feasibility, and alignment with business and customer goals, NFRs, and applicable automation/integration/security practices—being responsible for designing and maintaining code repos, CI/CD pipelines, integrations (code quality, QE automation, security, etc.) and environments (sandboxes, dev, test, stage, production) through IaC, both for custom and package solutions, including identifying, assessing, and remediating vulnerabilities. Engineering Craftsmanship: Maintain accountability for the integrity and design of DevSecOps pipelines and environments while leading the implementation of deployment techniques like Blue-Green, Canary to minimize down-time and enable A/B testing. Be always hands-on and actively engage with engineers to ensure DevSecOps practices are understood and can be implemented throughout the product development life cycle. Resolve any technical issues from implementation to production operations (e.g., leading triage and troubleshooting production issues). Be self-driven to learn new technologies, experiment with engineers, and inspire the team to learn and drive application of those new technologies. Customer-Centric Engineering: Develop lean, and yet scalable and flexible, DevSecOps automations through rapid, inexpensive experimentation to solve customer needs, enabling version control, security, logging, feedback loops, continuous delivery, etc. Engage with customers and product teams to deliver the right automation, security, and deployment practices. Incremental and Iterative Delivery: Adopt a mindset that favors action and evidence over extensive planning. Utilize a leaning-forward approach to navigate complexity and uncertainty, delivering lean, supportable, and maintainable solutions. Cross-Functional Collaboration and Integration: Work collaboratively with empowered, cross-functional teams including product management, experience, engineering, delivery, infrastructure, and security. Integrate diverse perspectives to make well-informed decisions that balance feasibility, viability, usability, and value. Support a collaborative environment that enhances team synergy and innovation. 
Advanced Technical Proficiency: Possess intermediary knowledge in modern software engineering practices and principles, including Agile methodologies, DevSecOps, Continuous Integration/Continuous Deployment. Strive to be a role model, leveraging these techniques to optimize solutioning and product delivery, ensuring high-quality outcomes with minimal waste. Demonstrate intermediate level understanding of the product development lifecycle, from conceptualization and design to implementation and scaling, with a focus on continuous improvement and learning. Domain Expertise: Quickly acquire domain-specific knowledge relevant to the business or product. Translate business/user needs into technical requirements and automations. Learn to navigate various enterprise functions such as product, experience, engineering, compliance, and security to drive product value and feasibility. Effective Communication and Influence: Exhibit exceptional communication skills, capable of articulating technical concepts clearly and compellingly. Support teammates and product teams through well-structured arguments and trade-offs supported by evidence, evaluations, and research. Learn to create a coherent narrative that align technical solutions with business objectives. Engagement and Collaborative Co-Creation: Able to engage and collaborate with product engineering teams, including customers as needed. Able to build and maintain constructive relationships, fostering a culture of co-creation and shared momentum towards achieving product goals. Support diverse perspectives and consensus to create feasible solutions. The team : US Deloitte Technology Product Engineering has modernized software and product delivery, creating a scalable, cost-effective model that focuses on value/outcomes by leveraging a progressive and responsive talent structure. As Deloitte’s primary internal development team, Product Engineering delivers innovative digital solutions to businesses, service lines, and internal operations with proven bottom-line results and outcomes. It helps power Deloitte’s success. It is the engine that drives Deloitte, serving many of the world’s largest, most respected companies. We develop and deploy cutting-edge internal and go-to-market solutions that help Deloitte operate effectively and lead in the market. Our reputation is built on a tradition of delivering with excellence. Key Qualifications : § A bachelor’s degree in computer science, software engineering, or a related discipline. An advanced degree (e.g., MS) is preferred but not required. Experience is the most relevant factor. § Strong software engineering foundation with deep understanding of OOP/OOD, functional programming, data structures and algorithms, software design patterns, code instrumentations, etc. § 5+ years proven experience with Python, Bash, PowerShell, JavaScript, C#, and Golang (preferred). § 5+ years proven experience with CI/CD tools (Azure DevOps and GitHub Enterprise) and Git (version control, branching, merging, handling pull requests) to automate build, test, and deployment processes. § 5+ years of hands-on experience in security tools automation SAST/DAST (SonarQube, Fortify, Mend), monitoring/logging (Prometheus, Grafana, Dynatrace), and other cloud-native tools on AWS, Azure, and GCP. § 5+ years of hands-on experience in using Infrastructure as Code (IaC) technologies like Terraform, Puppet, Azure Resource Manager (ARM), AWS Cloud Formation, and Google Cloud Deployment Manager. 
§ 2+ years of hands-on experience with cloud native services like Data Lakes, CDN, API Gateways, Managed PaaS, Security, etc. on multiple cloud providers like AWS, Azure, and GCP is preferred.
§ Strong understanding of methodologies like XP, Lean, and SAFe to deliver high-quality products rapidly.
§ General understanding of cloud providers' security practices, database technologies, and maintenance (e.g. RDS, DynamoDB, Redshift, Aurora, Azure SQL, Google Cloud SQL).
§ General knowledge of networking, firewalls, and load balancers.
§ Strong preference will be given to candidates with AI/ML and GenAI experience.
§ Excellent interpersonal and organizational skills, with the ability to handle diverse situations, complex projects, and changing priorities, behaving with passion, empathy, and care.

How You Will Grow: At Deloitte, our professional development plans focus on helping people at every level of their career to identify and use their strengths to do their best work every day and excel in everything they do.

Our purpose: Deloitte's purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.

Our people and culture: Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.

Professional development: At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India.

Benefits to help you thrive: At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially—and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment and/or other criteria. Learn more about what working at Deloitte can mean for you.

Recruiting tips: From developing a stand-out resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.

Requisition code: 300653

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Description

At Amazon, we strive to be the most innovative and customer-centric company on the planet. Come work with us to develop innovative products, tools, and research-driven solutions in a fast-paced environment by collaborating with smart and passionate leaders, program managers, and software developers. This role is based out of our Bangalore corporate office and is for a passionate, dynamic, analytical, innovative, hands-on, and customer-centric Business Analyst.

Key job responsibilities
This role primarily focuses on deep-dives, creating dashboards for the business, and working with different teams to develop and track metrics and bridges.
Design, develop and maintain scalable, automated, user-friendly systems, reports, dashboards, etc. that will support our analytical and business needs
In-depth research of drivers of the Localization business
Analyze key metrics to uncover trends and root causes of issues
Suggest and build new metrics and analysis that enable better perspective on business
Capture the right metrics to influence stakeholders and measure success
Develop domain expertise and apply it to operational problems to find solutions
Work across teams with different stakeholders to prioritize and deliver data and reporting
Recognize and adopt best practices in reporting and analysis: data integrity, test design, analysis, validation, and documentation

Basic Qualifications
5+ years of experience analyzing and interpreting data with Redshift, Oracle, NoSQL, etc.
Experience with data visualization using Tableau, Quicksight, or similar tools
Experience with data modeling, warehousing and building ETL pipelines
Experience using advanced SQL to pull data from a database or data warehouse and scripting experience (Python) to process data for modeling

Preferred Qualifications
Experience with AWS solutions such as EC2, DynamoDB, S3, and Redshift
Experience in data mining, ETL, etc. and using databases in a business environment with large-scale, complex datasets

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.

Company - ADCI - BLR 14 SEZ
Job ID: A3009497
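As a sketch of the "advanced SQL plus Python scripting" workflow this posting describes, the example below runs a window-function query against Redshift and post-processes the result in pandas; connection details, schema, and column names are hypothetical.

```python
# Sketch: pull a window-function result set from Redshift and rank movers in pandas.
# Host, credentials, and the localization.weekly_metrics table are illustrative.
import pandas as pd
import redshift_connector  # Amazon's Python driver for Redshift

conn = redshift_connector.connect(
    host="example-cluster.abc123.ap-south-1.redshift.amazonaws.com",
    database="analytics",
    user="analyst",
    password="***",
)

query = """
    SELECT marketplace,
           week_start,
           localized_listings,
           localized_listings
             - LAG(localized_listings) OVER (PARTITION BY marketplace ORDER BY week_start)
             AS wow_change
    FROM   localization.weekly_metrics
    WHERE  week_start >= DATEADD(week, -12, CURRENT_DATE);
"""

df = pd.read_sql(query, conn)  # pandas accepts a DBAPI connection here
top_movers = df.sort_values("wow_change", ascending=False).head(10)
print(top_movers.to_string(index=False))
```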

Posted 2 weeks ago

Apply

0 years

0 Lacs

India

Remote

Role We are looking for a Test Engineer who will become part of our team building and testing the Creditsafe data. You will be working closely with the database teams and data engineering to build specific systems facilitating the extraction and transformation of Creditsafe data. Based on the test strategy and approach you will develop, enhance and execute tests that add value to Creditsafe data. You will act as a primary source of guidance to Junior Test Engineers and Test Engineers in all areas of data quality. You will contribute to the team using data quality best practices and techniques. You can confidently communicate test results with your team members and stakeholders using evidence and reports. You act as a mentor and coach to the less experienced members of the test team. You will promote and coach leading practices in data test management, design, and implementation. You will be part of an Agile team and will effectively contribute to the ceremonies, acting as the quality specialist within that team. You are an influencer and will provide leadership in defining and implementing agreed standards and will actively promote this within your team and the wider development community. The ideal candidate has extensive experience in mentorship and leading by example and is able to communicate values consistent with the Creditsafe philosophy of engagement. You have critical thinking skills and can diplomatically communicate within, and outside their areas of responsibility, challenging assumptions where required. Required Skills: Proven working experience as a data test engineer or business data analyst or ETL tester. Technical expertise regarding data models, database design development, data mining and segmentation techniques Strong knowledge of and experience with SQL databases Hands on experience of best engineering practices (handling and logging errors, system monitoring and building human-fault-tolerant applications) Knowledge of statistics and experience using statistical packages for analysing datasets (Excel, SPSS, SAS etc.) is an advantage. Comfortable working with relational databases such as Redshift, Oracle, PostgreSQL, MySQL, and MariaDB (PostgreSQL preferred) Strong analytical skills with the ability to collect, organise, analyse, and disseminate significant amounts of information with attention to detail and accuracy Adept at queries, report writing and presenting findings. BS in Mathematics, Economics, Computer Science, Information Management or Statistics is desirable but not essential A good understanding of cloud technology, preferably AWS and/or Azure DevOps A practical understanding of programming: JavaScript, Python Excellent communication skills Practical experience of testing in an Agile approach Desirable Skills An understanding of version control systems Practical experience of conducting code reviews Practical experience of pair testing and pair programming Primary Responsibilities Reports to Engineering Lead Work as part of the engineering team in data acquisition Designing and implementing processes and tools to monitor and improve the quality of Creditsafe's data. Developing and executing test plans to verify the accuracy and reliability of data. Working with data analysts and other stakeholders to establish and maintain data governance policies. Identifying and resolving issues with the data, such as errors, inconsistencies, or duplication. 
Collaborating with other teams, such as data analysts and data scientists, to ensure the quality of data used for various projects and initiatives.
Providing training and guidance to other team members on data quality best practices and techniques.
Monitoring and reporting on key data quality metrics, such as data completeness and accuracy.
Continuously improving data quality processes and tools based on feedback and analysis.
Work closely with their Agile team to promote a whole-team approach to quality.
Document approaches and processes that improve the quality effort for use by team members and the wider test function.
Strong practical knowledge of software testing techniques and the ability to advise on, and select, the correct technique dependent on the problem at hand.
Conduct analysis of the team's test approach, taking a proactive role in the formulation of the relevant quality criteria in line with the team goals.
Work with team members to define standards and processes applicable to their area of responsibility.
Monitor progress of team deliverables, injecting quality concerns in a timely, effective manner.
Gain a sufficient understanding of the system architecture to inform their test approach and that of the test engineers.
Creation and maintenance of concise and accurate defect reports in line with the established defect process.

Job Types: Full-time, Permanent
Benefits: Flexible schedule, Health insurance, Provident Fund, Work from home
Schedule: Monday to Friday
Supplemental Pay: Performance bonus
Work Location: In person
Speak with the employer: +91 9121185668
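A minimal illustration of the automated data-quality checks this posting describes, written as pytest tests against PostgreSQL (the preferred database in the listing); connection settings and table names are assumptions.

```python
# Sketch of data-quality tests (uniqueness, completeness) expressed as pytest cases.
# Database, credentials, and table names are illustrative placeholders.
import psycopg2
import pytest


@pytest.fixture(scope="module")
def cursor():
    conn = psycopg2.connect(host="localhost", dbname="creditsafe_test",
                            user="tester", password="***")
    yield conn.cursor()
    conn.close()


def test_company_ids_are_unique(cursor):
    cursor.execute("""
        SELECT company_id FROM companies
        GROUP BY company_id HAVING COUNT(*) > 1;
    """)
    assert cursor.fetchall() == [], "duplicate company_id values found"


def test_scores_are_complete(cursor):
    cursor.execute("SELECT COUNT(*) FROM credit_scores WHERE score IS NULL;")
    assert cursor.fetchone()[0] == 0, "credit_scores.score contains NULLs"
```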

Posted 2 weeks ago

Apply

5.0 years

4 - 8 Lacs

Mohali

On-site

Company Introduction:
A dynamic company headquartered in Australia. Multi-award winner, recognized for excellence in the telecommunications industry. Financial Times Fastest-growing Company APAC 2023. AFR (Australian Financial Review) Fast 100 Company 2022. Great promotion opportunities that acknowledge and reward your hard work. Young, energetic and innovative team, with a caring and supportive work environment.

About You:
We are seeking an experienced and highly skilled Data Warehouse Engineer with an energetic 'can do' attitude to join our data and analytics team within our dynamic IT team. The ideal candidate will have over 5 years of hands-on experience in designing, building, and maintaining scalable data pipelines and reporting infrastructure. You will be responsible for managing our data warehouse, automating ETL workflows, building dashboards, and enabling data-driven decision-making across the organization.

Your responsibilities will include, but are not limited to:
Design, implement, and maintain robust, scalable data pipelines using Apache NiFi, Airflow, or similar ETL tools.
Develop and manage efficient data ingestion and transformation workflows, including web data crawling using Python.
Create, optimize, and maintain complex SQL queries to support business reporting needs.
Build and manage interactive dashboards and visualizations using Apache Superset (preferred), Power BI, or Tableau.
Collaborate with business stakeholders and analysts to gather requirements, define KPIs, and deliver meaningful data insights.
Ensure data accuracy, completeness, and consistency through rigorous quality assurance processes.
Maintain and optimize the performance of the data warehouse, supporting high availability and fast query response times.
Document technical processes and data workflows for maintainability and scalability.

To be successful in this role you will ideally possess:
5+ years of experience in data engineering, business intelligence, or a similar role.
Strong proficiency in Python, particularly for data crawling, parsing, and automation tasks.
Expertise in SQL (including complex joins, CTEs, window functions) for reporting and analytics.
Hands-on experience with Apache Superset (preferred) or equivalent BI tools like Power BI or Tableau.
Proficiency with ETL tools such as Apache NiFi, Airflow, or similar data pipeline frameworks.
Experience working with cloud-based data warehouse platforms (e.g., Amazon Redshift, Snowflake, BigQuery, or PostgreSQL).
Strong understanding of data modeling, warehousing concepts, and performance optimization.
Ability to work independently and collaboratively in a fast-paced environment.

Preferred Qualifications:
Experience with version control (e.g., Git) and CI/CD processes for data workflows.
Familiarity with REST APIs and web scraping best practices.
Knowledge of data governance, privacy, and security best practices.
Background in the telecommunications or ISP industry is a plus.

Job Types: Full-time, Permanent
Pay: ₹40,000.00 - ₹70,000.00 per month
Benefits: Leave encashment, Paid sick time, Provident Fund
Schedule: Day shift, Monday to Friday
Supplemental Pay: Overtime pay, Performance bonus
Work Location: In person
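As an illustration of the crawl-transform-load pipelines this posting describes, here is a skeletal Airflow 2.x DAG; the task bodies, schedule, and names are placeholders rather than the company's actual workflow.

```python
# Sketch of an Airflow DAG wiring crawl -> transform -> load, the shape of pipeline
# the posting describes (NiFi/Airflow + Python crawling + warehouse load).
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def crawl_listings(**_):
    # e.g. fetch pages with requests/BeautifulSoup and stage raw JSON to object storage
    ...


def transform_listings(**_):
    # e.g. clean and normalize the staged data with pandas or PySpark
    ...


def load_to_warehouse(**_):
    # e.g. COPY curated files into Redshift/Snowflake/PostgreSQL
    ...


with DAG(
    dag_id="listings_daily",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",  # Airflow 2.4+ style schedule argument
    catchup=False,
) as dag:
    crawl = PythonOperator(task_id="crawl", python_callable=crawl_listings)
    transform = PythonOperator(task_id="transform", python_callable=transform_listings)
    load = PythonOperator(task_id="load", python_callable=load_to_warehouse)

    crawl >> transform >> load
```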

Posted 2 weeks ago

Apply

5.0 - 6.0 years

7 - 8 Lacs

Ahmedabad

On-site

Data Engineer - AWS (Financial Data Reconciliation)
Experience: 5-6 years
Location: On-site, Ahmedabad

Technical Skills:
AWS Stack: Redshift, Glue (PySpark), Lambda, Step Functions, CloudWatch, S3, Athena
Languages: Python (Pandas, PySpark), SQL (Redshift/PostgreSQL)
ETL & Orchestration: Apache Airflow (MWAA), AWS Glue Workflows, AWS Step Functions
Data Modeling: Experience with financial/transactional data schemas
Data Architecture: Medallion (bronze/silver/gold) design, lakehouse patterns, slowly changing dimensions

Job Type: Contractual / Temporary
Contract length: 6 months
Pay: ₹60,000.00 - ₹70,000.00 per month
Schedule: Day shift
Ability to commute/relocate: Ahmedabad, Gujarat: Reliably commute or planning to relocate before starting work (Preferred)
Experience: AWS: 5 years (Preferred)
Work Location: In person
Expected Start Date: 10/07/2025
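To illustrate the medallion (bronze/silver/gold) pattern named in this listing, here is a hedged PySpark sketch of one bronze-to-silver hop for transaction data; paths, columns, and the deduplication rule are illustrative only, not the client's actual reconciliation logic.

```python
# Sketch of a medallion-style bronze -> silver step for financial transactions:
# keep the latest record per transaction_id and normalize amounts before the
# silver/gold reconciliation layers. All names and paths are placeholders.
from pyspark.sql import SparkSession, Window, functions as F

spark = SparkSession.builder.appName("bronze_to_silver_txns").getOrCreate()

bronze = spark.read.parquet("s3://example-lake/bronze/transactions/")

# Late-arriving corrections win: take the most recently ingested row per key.
latest = Window.partitionBy("transaction_id").orderBy(F.col("ingested_at").desc())

silver = (
    bronze.withColumn("rn", F.row_number().over(latest))
    .filter(F.col("rn") == 1)
    .drop("rn")
    .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
)

silver.write.mode("overwrite").parquet("s3://example-lake/silver/transactions/")
```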

Posted 2 weeks ago

Apply

2.0 years

4 - 5 Lacs

Indore

On-site

Expertise in functional programming using JavaScript (ES5, ES6)
Expertise in UI frameworks - React/Redux, RxJS
Experience with React Native is preferred
Preferred experience with a new generation of web programming - using microservices, REST/JSON, and component UI models
Expertise in data visualization flow development, along with usage of modern charting and graphical JavaScript libraries
Preferred experience with Docker-based development/deployment platforms
Preferred experience with AWS Cloud, Amazon Redshift, or PostgreSQL

Skills: React/Redux, RxJS, HTML, CSS, JavaScript (ES5, ES6), data visualization and chart libraries
Experience: Minimum 2-3 years of experience
There is a 2-year bond with the company. Only those who are willing to commit to this bond should apply. Female candidates preferred only.
For more information, please contact on this number: 8827277596

Job Types: Full-time, Permanent
Pay: ₹40,000.00 - ₹45,000.00 per month
Benefits: Paid sick time, Paid time off
Schedule: Day shift
Supplemental Pay: Yearly bonus
Experience: React: 2 years (Required); React Native: 2 years (Required)
Work Location: In person
Speak with the employer: +91 8827277596

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

Remote

ZS is a place where passion changes lives. As a management consulting and technology firm focused on improving life and how we live it, our most valuable asset is our people. Here you’ll work side-by-side with a powerful collective of thinkers and experts shaping life-changing solutions for patients, caregivers and consumers, worldwide. ZSers drive impact by bringing a client first mentality to each and every engagement. We partner collaboratively with our clients to develop custom solutions and technology products that create value and deliver company results across critical areas of their business. Bring your curiosity for learning; bold ideas; courage and passion to drive life-changing impact to ZS. Our most valuable asset is our people . At ZS we honor the visible and invisible elements of our identities, personal experiences and belief systems—the ones that comprise us as individuals, shape who we are and make us unique. We believe your personal interests, identities, and desire to learn are part of your success here. Learn more about our diversity, equity, and inclusion efforts and the networks ZS supports to assist our ZSers in cultivating community spaces, obtaining the resources they need to thrive, and sharing the messages they are passionate about. What you’ll do: Lead end to end projects using cloud technologies to solve complex business problems Provide technology expertise to maximize value for clients and project teams Drive strong delivery methodology to ensure projects are delivered on time, within budget and to client’s satisfaction Ensure technology solutions are scalable, resilient, and optimized for performance and cost Guide coach and mentor project team members for continuous learning and professional growth Demonstrate expertise, facilitation, and strong interpersonal skills in internal and client interactions Collaborate with ZS experts to drive innovation and minimize project risks Work globally with team members to ensure a smooth project delivery Bring structure to unstructured work for developing business cases with clients Assist ZS Leadership with business case development, innovation, thought leadership and team initiatives What you’ll bring: Candidates must either be in their junior year of a Bachelor's degree or in their first year of a Master's degree specializing in Business Analytics, Computer Science, MIS, MBA, or a related field with academic excellence 5+ years of consulting experience in leading large-scale technology implementations Strong communication skills to convey technical concepts to diverse audiences Significant supervisory, coaching, and hands on project management skills Extensive experience with major cloud platforms like AWS, Azure and GCP Deep knowledge of enterprise data management, advanced analytics, process automation, and application development Familiarity with industry- standard products and platforms such as Snowflake, Databricks, Redshift, Salesforce, Power BI, Cloud. Experience in delivering projects using agile methodologies Additional skills: Capable of managing a virtual global team for the timely delivery of multiple projects Experienced in analyzing and troubleshooting interactions between databases, operating systems, and applications Travel to global offices as required to collaborate with clients and internal project teams Perks & Benefits: ZS offers a comprehensive total rewards package including health and well-being, financial planning, annual leave, personal growth and professional development. 
Our robust skills development programs, multiple career progression options and internal mobility paths and collaborative culture empowers you to thrive as an individual and global team member. We are committed to giving our employees a flexible and connected way of working. A flexible and connected ZS allows us to combine work from home and on-site presence at clients/ZS offices for the majority of our week. The magic of ZS culture and innovation thrives in both planned and spontaneous face-to-face connections. Travel: Travel is a requirement at ZS for client facing ZSers; business needs of your project and client are the priority. While some projects may be local, all client-facing ZSers should be prepared to travel as needed. Travel provides opportunities to strengthen client relationships, gain diverse experiences, and enhance professional growth by working in different environments and cultures. Considering applying? At ZS, we're building a diverse and inclusive company where people bring their passions to inspire life-changing impact and deliver better outcomes for all. We are most interested in finding the best candidate for the job and recognize the value that candidates with all backgrounds, including non-traditional ones, bring. If you are interested in joining us, we encourage you to apply even if you don't meet 100% of the requirements listed above. ZS is an equal opportunity employer and is committed to providing equal employment and advancement opportunities without regard to any class protected by applicable law. To Complete Your Application: Candidates must possess or be able to obtain work authorization for their intended country of employment.An on-line application, including a full set of transcripts (official or unofficial), is required to be considered. NO AGENCY CALLS, PLEASE. Find Out More At: www.zs.com

Posted 2 weeks ago

Apply

1.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job title: Business Analyst, SailPoint (SAS) Success Acceleration Services About SailPoint: SailPoint is the leader in identity security for cloud enterprises. Our identity security solutions secure and enable thousands of companies worldwide, giving our customers unmatched visibility into the entirety of their digital workforce, ensuring workers have the right access to do their job – no more, no less. Built on a foundation of Artificial Intelligence and Machine Learning, our Identity Security Cloud Platform delivers the right level of access to the right identities and resources at the right time — matching the scale, velocity, and changing needs of today’s cloud-oriented, modern enterprise. About the role: The Success Acceleration Services team at SailPoint is looking for someone who is strongly motivated, has a keen sense of responsibility, positive attitude, high energy, strong attention to detail. This role will be to work with the SASP team to provide both day-to-day insights and support for our Services and delivery. This role will involve working with CRM and PSA tools to keep records up to date, forecasting accurately, and provide Services delivery governance ensuring operations are running smoothly. Roadmap for success 30 days: During the first 30 days, you will delve into understanding SailPoint's offerings, organizational structure, and team dynamics. You will have regular check-ins with your mentor, who will assist you in navigating the tools, processes, and active projects that are critical to your role. Familiarize yourself with project management and CRM-type tools alongside understanding the best practices that are used within the organization. Shadow ongoing Business Analyst activities, observing the dynamics of executing tasks and supporting the team that you are working with. 90 days: Take full ownership of administrative tasks and perform these independently. 6 months: At the 6-month mark, you should have developed a keen sense of the current administrative tasks at hand, ensuring clear boundaries between must-haves and nice-to-haves. Build and maintain strong relationships within and outside of the SAS Team. You should be able to point out areas of improvement in our current processes, propose ideas, collaborate with different team members on internal & external initiatives. You will serve as the primary point of contact for administrative requests. 1 year: By the end of your first year, you would have the ability to mentor new resources and grow team capability while successfully managing your own tasks. You will have the knowledge to create and maintain various knowledge bases to support Program development on an ad-hoc basis. Requirements: 2-3 years' experience working as a business analyst or administrative position demonstrating a high degree of productivity and effectiveness. Proven ability to coordinate between cross-functional teams, driving collaboration and resolving conflicts to maintain project/ program momentum Experience working with external stakeholders, for example communicating via email or CRM tools. 
Demonstrated ability to manage multiple tasks simultaneously and to resolve scheduling and other conflicts to meet all deadlines.
Highly self-driven and motivated, with a strong work ethic and initiative.
Ability to work effectively in diverse teams, with an awareness of diverse cultural nuances and communication styles.
Ability to understand client needs, manage expectations, provide updates, and deliver solutions that align with business objectives.
Excellent written and verbal communication skills, and the ability to comprehensively and clearly present strategic issues and solutions.
Experience in using and building dashboards using spreadsheet software like Microsoft Excel and Smartsheet is a strong plus.
Experience with Salesforce and ServiceNow.
Proven skills at cultivating strong working relationships and working well within a team to learn and share knowledge.
Collaborate with stakeholders to understand their needs and gather detailed business requirements.
Analyze data to identify trends, patterns, and insights that inform business decisions.
Evaluate internal systems for efficiency, problems, and inaccuracies, and develop and maintain protocols for handling, processing, and cleaning data.
Ability to work in multiple time zones, specifically supporting the United States time zones.

Education: Bachelor's degree or equivalent experience (Computer Science or Engineering degree a plus).

Preferred: Exposure to Customer Success Delivery and Operations in both large and small companies. Proficiency in Redshift, PowerBI, SQL. Experience with Identity Management, Security or Governance would be a bonus.

Certifications: ECBA, PCBA and CBAP are a plus to have.

About the team: We are a global, dynamic, multicultural and multilingual team that thrives in a fast-paced, ever-evolving environment. From technical experts to senior management, we collaborate closely to tackle any situation head-on with a positive mindset. We are goal-driven and solution-focused, turning every challenge into an opportunity while supporting and learning from one another. Our team is passionate, curious, and always ready to dive deep, bringing people together to solve anything unknown and deliver results with professionalism and care. We work hard, move fast and continuously bring fresh ideas to the table, all while fostering a culture of growth, inclusion, and mutual respect. We invest in our people, champion their careers, and ensure our customers and business are always at the forefront. If you are proactive, eager to learn and ready to make a real impact, join us in shaping the future as part of this incredible worldwide operating team.

SailPoint is an equal opportunity employer and we welcome all qualified candidates to apply to join our team.
All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, protected veteran status, or any other category protected by applicable law. Alternative methods of applying for employment are available to individuals unable to submit an application through this site because of a disability. Contact hr@sailpoint.com or mail to 11120 Four Points Dr, Suite 100, Austin, TX 78726, to discuss reasonable accommodations.

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Panaji, Goa, India

On-site

About the Project We are seeking an expert Senior Backend Developer to be a principal architect of the "Stealth Prop-tech Platform", a groundbreaking digital real estate platform in Dubai. This is a complex initiative to build a comprehensive ecosystem integrating long-term sales, short-term stays, and advanced technologies including AI/ML, data analytics, Web3/blockchain, and conversational AI. You will be responsible for building the robust, scalable, and secure server-side logic that powers the entire platform across web, iOS, and Android clients. This is a critical leadership role in a high-impact project, offering the chance to design and build a sophisticated backend architecture from the ground up. Job Summary As a Senior Backend Developer, you will design, develop, and maintain the core services and APIs that drive the Prop-tech platform. You will be responsible for everything "under the hood," from database architecture to the business logic that handles property listings, user management, financial transactions, and advanced AI features. You will work closely with frontend and mobile developers, product managers, and data scientists to bring the platform's vision to life, ensuring high performance, reliability, and scalability. Key Responsibilities Design, build, and maintain scalable and secure RESTful APIs to serve all frontend clients (web, iOS, Android). Develop the core business logic for all platform features, including user authentication, property management (CRUD), search algorithms, and monetization systems (subscriptions, payments). Architect and manage the platform's database schema (PostgreSQL), ensuring data integrity, performance, and scalability. Lead the integration of numerous third-party services, including payment gateways (e.g., Stripe), mapping services (Google Maps), messaging APIs (Twilio), and virtual tour providers. Collaborate with the AI/ML team (Workstream 3) to build the data pipelines and API endpoints necessary to support features like recommendation engines and automated property valuations. Work with the Web3 team (Workstream 4) to integrate backend services with blockchain platforms for tokenization and cryptocurrency payment gateways. Implement robust security and data protection measures in line with international standards (e.g., UAE PDPL, Singapore PDPA). Mentor junior backend developers, conduct code reviews, and establish best practices for coding, testing, and deployment. Design and implement a scalable, service-oriented or microservices-based architecture to support long-term growth and feature expansion. Required Skills and Experience 5+ years of experience in backend software development, with a proven track record of building and launching complex, high-traffic applications. Expert proficiency in at least one modern backend programming language and framework (e.g., Python with Django/Flask, Node.js with Express, Go, or Java with Spring). Strong experience designing and building RESTful APIs and service-oriented architectures. Deep expertise in relational database design and management, particularly with PostgreSQL. Hands-on experience with cloud platforms (AWS, Google Cloud, or Azure) and deploying applications in a cloud environment. Solid understanding of software security principles and best practices. Experience with version control systems (Git) and CI/CD pipelines. Preferred Qualifications A Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
Experience in the PropTech (Property Technology) or FinTech sectors. Experience working on projects that involve AI/ML, such as building APIs for recommendation systems or predictive models. Familiarity with blockchain concepts and experience integrating backend systems with Web3 technologies. Experience with containerization technologies like Docker and orchestration tools like Kubernetes. Knowledge of big data technologies (e.g., data warehouses like BigQuery/Redshift) and building data pipelines.
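Purely as an illustration of the kind of endpoint this role describes, here is a minimal sketch assuming the Python/Flask option mentioned in the posting, with SQLAlchemy over PostgreSQL. The Property model, route, and connection string are hypothetical placeholders, not details from the posting.

```python
# Minimal sketch of a property-listing REST endpoint (hypothetical names),
# assuming Flask + SQLAlchemy over PostgreSQL as one of the stacks the posting allows.
from flask import Flask, jsonify, request
from flask_sqlalchemy import SQLAlchemy

app = Flask(__name__)
app.config["SQLALCHEMY_DATABASE_URI"] = "postgresql://user:pass@localhost/proptech"  # placeholder DSN
db = SQLAlchemy(app)

class Property(db.Model):
    id = db.Column(db.Integer, primary_key=True)
    title = db.Column(db.String(200), nullable=False)
    city = db.Column(db.String(100), index=True)
    price_aed = db.Column(db.Numeric(12, 2))

@app.route("/api/properties", methods=["GET"])
def list_properties():
    # Simple filtered listing; a real search API would add pagination, auth, and ranking.
    query = Property.query
    if city := request.args.get("city"):
        query = query.filter_by(city=city)
    return jsonify([
        {"id": p.id, "title": p.title, "city": p.city, "price_aed": float(p.price_aed or 0)}
        for p in query.limit(50)
    ])

if __name__ == "__main__":
    with app.app_context():
        db.create_all()  # create the demo table so the sketch runs end to end
    app.run(debug=True)
```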

Posted 2 weeks ago

Apply

8.0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site

Greetings from Analytix Solutions! We are seeking an experienced and motivated Senior Data Engineer to join our AI & Automation team. The ideal candidate will have 6–8 years of experience in data engineering, with a proven track record of designing and implementing scalable data solutions. A strong background in database technologies, data modeling, and data pipeline orchestration is essential. Additionally, hands-on experience with generative AI technologies and their applications in data workflows will set you apart. In this role, you will lead data engineering efforts to enhance automation, drive efficiency, and deliver data-driven insights across the organization. Company Name: Analytix Business Solutions (US-based MNC). Company at a Glance: We are a premier knowledge process outsourcing unit based in Ahmedabad, fully owned by Analytix Solutions LLC, headquartered in the USA. We customize a wide array of business solutions including IT services, Audio-Visual services, Data management services, and Finance and accounting services for small and mid-size companies across diverse industries. We partner with and offer our services to Restaurants, Dental services, Dunkin' Donuts franchises, Hotels, Veterinary services, and others, including start-ups from other industries. For more details about our organization, please visit https://www.analytix.com/ LinkedIn: Analytix Business Solutions (India) Pvt. Ltd. Roles & Responsibilities: Design, build, and maintain scalable, high-performance data pipelines and ETL/ELT processes across diverse database platforms. Architect and optimize data storage solutions to ensure reliability, security, and scalability. Leverage generative AI tools and models to enhance data engineering workflows, drive automation, and improve insight generation. Collaborate with cross-functional teams (Data Scientists, Analysts, and Engineers) to understand and deliver on data requirements. Develop and enforce data quality standards, governance policies, and monitoring systems to ensure data integrity. Create and maintain comprehensive documentation for data systems, workflows, and models. Implement data modeling best practices and optimize data retrieval processes for better performance. Stay up to date with emerging technologies and bring innovative solutions to the team. Competencies & Skills: Bachelor's or Master's degree in Computer Science, Information Technology, or a related field. 6–8 years of experience in data engineering, designing and managing large-scale data systems. Advanced knowledge of Database Management Systems and ETL/ELT processes. Expertise in data modeling, data quality, and data governance. Proficiency in Python programming, version control systems (Git), and data pipeline orchestration tools. Familiarity with AI/ML technologies and their application in data engineering. Strong problem-solving and analytical skills, with the ability to troubleshoot complex data issues. Excellent communication skills, with the ability to explain technical concepts to non-technical stakeholders. Ability to work independently, lead projects, and mentor junior team members. Commitment to staying current with emerging technologies, trends, and best practices in the data engineering domain.
Technology Stacks: Strong expertise in database technologies, including SQL databases (PostgreSQL, MySQL, SQL Server), NoSQL databases (MongoDB, Cassandra), and data warehouse/unified platforms (Snowflake, Redshift, BigQuery, Microsoft Fabric). Hands-on experience implementing and working with generative AI tools and models in production workflows. Proficiency in Python and SQL, with experience in data processing frameworks (e.g., Pandas, PySpark). Experience with ETL tools (e.g., Apache Airflow, MS Fabric, Informatica, Talend) and data pipeline orchestration platforms. Strong understanding of data architecture, data modeling, and data governance principles. Experience with cloud platforms (preferably Azure) and associated data services.
Our EVP (Employee Value Proposition): 5-day working week; 24 earned and casual leaves plus 8 public holidays; compensatory off; personal development allowances; opportunity to work with USA clients; career progression and Learning & Development; loyalty bonus benefits; medical reimbursement; standard salary as per market norms; magnificent and dynamic culture.
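To illustrate the pipeline-orchestration skills listed above, here is a minimal Apache Airflow DAG with a single Pandas-based extract-transform-load task. The file paths, column names, and schedule are hypothetical placeholders rather than anything specified by the employer.

```python
# Minimal Airflow DAG sketch (hypothetical paths and columns) illustrating
# ETL orchestration with a single PythonOperator task; not a production pipeline.
from datetime import datetime

import pandas as pd
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_transform_load():
    # Extract a raw CSV drop, derive a daily aggregate, and stage it as Parquet.
    df = pd.read_csv("/data/raw/orders.csv")
    df["order_date"] = pd.to_datetime(df["order_date"])
    daily = df.groupby(df["order_date"].dt.date)["amount"].sum().reset_index()
    daily.to_parquet("/data/staged/daily_orders.parquet", index=False)

with DAG(
    dag_id="daily_orders_etl",
    start_date=datetime(2025, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="extract_transform_load", python_callable=extract_transform_load)
```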

Posted 2 weeks ago

Apply

12.0 years

0 Lacs

Hyderabad, Telangana, India

Remote

We are seeking a highly experienced and hands-on Lead/Senior Data Engineer to architect, develop, and optimize data solutions in a cloud-native environment. The ideal candidate will have 7–12 years of strong technical expertise in AWS Glue, PySpark, and Python, along with experience designing robust data pipelines and frameworks for large-scale enterprise systems. Prior exposure to the financial domain or regulated environments is a strong advantage. Key Responsibilities: Solution Architecture: Design scalable and secure data pipelines using AWS Glue, PySpark, and related AWS services (EMR, S3, Lambda, etc.). Leadership & Mentorship: Guide junior engineers, conduct code reviews, and enforce best practices in development and deployment. ETL Development: Lead the design and implementation of end-to-end ETL processes for structured and semi-structured data. Framework Building: Develop and evolve data frameworks, reusable components, and automation tools to improve engineering productivity. Performance Optimization: Optimize large-scale data workflows for performance, cost, and reliability. Data Governance: Implement data quality, lineage, and governance strategies in compliance with enterprise standards. Collaboration: Work closely with product, analytics, compliance, and DevOps teams to deliver high-quality solutions aligned with business goals. CI/CD Automation: Set up and manage continuous integration and deployment pipelines using AWS CodePipeline, Jenkins, or GitLab. Documentation & Presentations: Prepare technical documentation and present architectural solutions to stakeholders across levels. Requirements: Required Qualifications: 7–12 years of experience in data engineering or related fields. Strong expertise in Python programming with a focus on data processing. Extensive experience with AWS Glue (both Glue Jobs and Glue Studio/Notebooks). Deep hands-on experience with PySpark for distributed data processing. Solid AWS knowledge: EMR, S3, Lambda, IAM, Athena, CloudWatch, Redshift, etc. Proven experience architecting and managing complex ETL workflows. Proficiency with Apache Airflow or similar orchestration tools. Hands-on experience with CI/CD pipelines and DevOps best practices. Familiarity with data quality, data lineage, and metadata management. Strong experience working in agile/scrum teams. Excellent communication and stakeholder engagement skills. Preferred/Good to Have: Experience in financial services, capital markets, or compliance systems. Knowledge of data modeling, data lakes, and data warehouse architecture. Familiarity with SQL (Athena/Presto/Redshift Spectrum). Exposure to ML pipeline integration or event-driven architecture is a plus. Benefits: Flexible work culture and remote options. Opportunity to lead cutting-edge cloud data engineering projects. Skill-building in large-scale, regulated environments.
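As a hedged sketch of the AWS Glue + PySpark pipeline work described above, the skeleton below shows the standard Glue job boilerplate plus a simple cleanse-and-partition step. It only executes inside an AWS Glue job environment, and the S3 paths and column names are assumptions, not details from the posting.

```python
# Skeleton of an AWS Glue PySpark job (hypothetical S3 paths and columns):
# read raw JSON, de-duplicate, and write date-partitioned Parquet to a curated zone.
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext
from pyspark.sql import functions as F

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
sc = SparkContext()
glue_context = GlueContext(sc)
spark = glue_context.spark_session
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

raw = spark.read.json("s3://example-bucket/raw/trades/")          # placeholder source
clean = (raw
         .withColumn("trade_date", F.to_date("trade_ts"))
         .dropDuplicates(["trade_id"]))
clean.write.mode("overwrite").partitionBy("trade_date").parquet(
    "s3://example-bucket/curated/trades/")                        # placeholder target

job.commit()
```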

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

New Delhi, Delhi, India

Remote

DHIRA Company Overview DHIRA is a leading company specializing in intelligent transformation, where we leverage advanced AI/ML and data-driven solutions to revolutionize business operations. Unlike traditional digital transformation, which focuses on transaction automation, our intelligent transformation encompasses both transactional automation and deep analytics for comprehensive insights. Our expertise in data engineering, data quality, and master data management ensures robust and scalable AI/ML applications. Utilizing cutting-edge technologies across AWS, Azure, GCP, and on-premises Hadoop systems, we deliver efficient and innovative data solutions. Our vision is embodied in the Akashic platform, designed to provide seamless, end-to-end analytics. At DHIRA, we are committed to excellence, driving impactful contributions to the industry. Join us to be part of a dynamic team at the forefront of intelligent transformation Role- Data Architect – Evolution of Databases, Data Modeling, and Modern Data Practices Location : Bangalore, Remote Position Overview: We are seeking a Principal Data Architect with 5+ years of experience who has a comprehensive understanding of the evolution of databases , from OLTP to OLAP, and relational systems to NoSQL, Graph, and emerging Vector Databases . This role requires deep expertise in data modeling , from traditional ER modeling to advanced dimensional, graph, and vector schemas, along with a strong grasp of the history, best practices, and future trends in data management. The ideal candidate will bring both historical context and cutting-edge expertise to architect scalable, high-performance data solutions, driving innovation while maintaining strong governance and best practices. This is a leadership role that demands a balance of technical excellence, strategic vision, and team mentorship. Key Responsibilities: 1. Data Modeling Expertise: – Design and implement Entity-Relationship Models (ER Models) for OLTP systems, ensuring normalization and consistency. – Transition ER models into OLAP environments with robust dimensional modeling, including star and snowflake schemas. – Develop hybrid data models that integrate relational, NoSQL, Graph, and Vector Database schemas. – Establish standards for schema design across diverse database systems, focusing on scalability and query performance. 2. Database Architecture Evolution: – Architect solutions across the database spectrum: • Relational databases (PostgreSQL, Oracle, MySQL) • NoSQL databases (MongoDB, Cassandra, DynamoDB) • Graph databases (Neo4j, Amazon Neptune) • Vector databases (Pinecone, Weaviate, Milvus). – Implement hybrid data architectures combining OLTP, OLAP, NoSQL, Graph, and Vector systems for diverse business needs. – Ensure compatibility and performance optimization across these systems for real-time and batch processing. 3. Data Warehousing and Analytics: – Lead the development of enterprise-scale Data Warehouses capable of supporting advanced analytics and business intelligence. – Design high-performance ETL/ELT pipelines to handle structured and unstructured data with minimal latency. – Optimize OLAP systems for petabyte-scale data storage and low-latency querying. 4. Emerging Database Technologies: – Drive adoption of Vector Databases for AI/ML applications, enabling semantic search and embedding-based queries. – Explore cutting-edge technologies in data lakes, lakehouses, and real-time processing systems. 
– Evaluate and integrate modern database paradigms, ensuring scalability for future business requirements. 5. Strategic Leadership: – Define the organization’s data strategy , aligning with long-term goals and emerging trends. – Collaborate with business and technical stakeholders to design systems that balance transactional and analytical workloads. – Lead efforts in data governance, ensuring compliance with security and privacy regulations. 6. Mentorship and Innovation: – Mentor junior architects and engineers, fostering a culture of learning and technical excellence. – Promote innovation by introducing best practices, emerging tools, and modern methodologies in data architecture. – Act as a thought leader in database evolution, presenting insights to internal teams and external forums. Required Skills & Qualifications: • Experience: – 6+ years of experience in data architecture, with demonstrated expertise across OLTP, OLAP, NoSQL, Graph, and Vector databases. – Proven experience designing and implementing data models across relational, NoSQL, graph, and vector systems. – A strong understanding of the evolution of databases and their impact on modern data architectures. • Technical Proficiency: – Deep expertise in ER modeling , dimensional modeling, and schema design for modern database systems. – Proficient in SQL and query optimization for relational and analytical databases. – Hands-on experience with NoSQL databases like MongoDB, Cassandra, or DynamoDB. – Strong knowledge of Graph databases (Neo4j, Amazon Neptune) and Vector databases (Pinecone, Milvus, or Weaviate). – Familiarity with modern cloud-based DW platforms (e.g., Snowflake, BigQuery, Redshift) and lakehouse solutions. • Knowledge of Data Practices: – Historical and practical understanding of data practices, from schema-on-write to schema-on-read approaches. – Experience in implementing real-time and batch processing systems for diverse workloads. – Strong grasp of data lifecycle management, governance, and security practices. • Leadership and Communication: – Ability to lead large-scale data initiatives, balancing technical depth and strategic alignment. – Excellent communication skills to articulate complex ideas to technical and non-technical audiences. – Proven ability to mentor and upskill teams, fostering a collaborative environment. Preferred Skills: • Experience integrating Vector Databases into existing architectures for AI/ML workloads. • Knowledge of real-time streaming systems (Kafka, Pulsar) and their integration with modern databases. • Certifications in data-related technologies (e.g., AWS, GCP, Snowflake, Neo4j). • Hands-on experience with BI tools (e.g., Tableau, Power BI) and AI/ML platforms.
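To make the dimensional-modeling vocabulary above concrete, here is an illustrative star-schema DDL for a hypothetical sales mart, executed from Python via psycopg2. The table and column names are invented, and on Redshift-style warehouses the key constraints would typically be informational rather than enforced.

```python
# Illustrative star schema (hypothetical sales mart): two dimensions and one fact table.
import psycopg2

DDL = """
CREATE TABLE dim_customer (
    customer_key BIGINT PRIMARY KEY,
    customer_id  VARCHAR(32) NOT NULL,
    segment      VARCHAR(50)
);

CREATE TABLE dim_date (
    date_key  INT PRIMARY KEY,   -- surrogate key such as 20250710
    full_date DATE NOT NULL,
    month     SMALLINT,
    year      SMALLINT
);

CREATE TABLE fact_sales (
    sale_id      BIGINT PRIMARY KEY,
    customer_key BIGINT REFERENCES dim_customer (customer_key),
    date_key     INT    REFERENCES dim_date (date_key),
    quantity     INT,
    amount       NUMERIC(12, 2)
);
"""

# Placeholder connection string; any PostgreSQL-compatible warehouse client works here.
with psycopg2.connect("dbname=analytics user=etl password=secret host=localhost") as conn:
    with conn.cursor() as cur:
        cur.execute(DDL)
```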

Posted 2 weeks ago

Apply

5.0 - 6.0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site

Data Engineer - AWS (Financial Data Reconciliation)
Exp: 5–6 years
Location: On-site, Ahmedabad
Technical Skills:
• AWS Stack: Redshift, Glue (PySpark), Lambda, Step Functions, CloudWatch, S3, Athena
• Languages: Python (Pandas, PySpark), SQL (Redshift/PostgreSQL)
• ETL & Orchestration: Apache Airflow (MWAA), AWS Glue Workflows, AWS Step Functions
• Data Modeling: Experience with financial/transactional data schemas
• Data Architecture: Medallion (bronze/silver/gold) design, lakehouse patterns, slowly changing dimensions
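For context on the medallion (bronze/silver/gold) layering named above, here is a rough PySpark sketch under assumed inputs: the S3 paths, column names, and the payments feed are hypothetical, and a real reconciliation pipeline would add validation and slowly-changing-dimension handling.

```python
# Rough medallion-architecture sketch in PySpark (hypothetical paths and columns).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("reconciliation_medallion").getOrCreate()

# Bronze: land raw payment files as-is, adding only ingestion metadata.
bronze = (spark.read.option("header", True).csv("s3://example-bucket/landing/payments/")
          .withColumn("ingested_at", F.current_timestamp()))
bronze.write.mode("append").parquet("s3://example-bucket/bronze/payments/")

# Silver: typed, de-duplicated records ready for reconciliation joins.
silver = (bronze
          .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
          .withColumn("txn_date", F.to_date("txn_date"))
          .dropDuplicates(["txn_id"]))
silver.write.mode("overwrite").partitionBy("txn_date").parquet("s3://example-bucket/silver/payments/")

# Gold: daily totals per source system, the shape a reconciliation report would consume.
gold = silver.groupBy("txn_date", "source_system").agg(F.sum("amount").alias("total_amount"))
gold.write.mode("overwrite").parquet("s3://example-bucket/gold/payment_totals/")
```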

Posted 2 weeks ago

Apply

0.0 - 5.0 years

0 - 0 Lacs

Ahmedabad, Gujarat

On-site

Data Engineer - AWS (Financial Data Reconciliation)
Exp: 5–6 years
Location: On-site, Ahmedabad
Technical Skills:
AWS Stack: Redshift, Glue (PySpark), Lambda, Step Functions, CloudWatch, S3, Athena
Languages: Python (Pandas, PySpark), SQL (Redshift/PostgreSQL)
ETL & Orchestration: Apache Airflow (MWAA), AWS Glue Workflows, AWS Step Functions
Data Modeling: Experience with financial/transactional data schemas
Data Architecture: Medallion (bronze/silver/gold) design, lakehouse patterns, slowly changing dimensions
Job Type: Contractual / Temporary. Contract length: 6 months. Pay: ₹60,000.00 - ₹70,000.00 per month. Schedule: Day shift.
Ability to commute/relocate: Ahmedabad, Gujarat: Reliably commute or planning to relocate before starting work (Preferred). Experience: AWS: 5 years (Preferred). Work Location: In person. Expected Start Date: 10/07/2025

Posted 2 weeks ago

Apply

0.0 - 12.0 years

0 Lacs

Hyderabad, Telangana

Remote

Job Information: Date Opened: 07/08/2025. Job Type: Full time. Industry: IT Services. City: Hyderabad. State/Province: Telangana. Country: India. Zip/Postal Code: 500059. About Us – About DATAECONOMY: We are a fast-growing data & analytics company headquartered in Dublin, with offices in Dublin, OH, and Providence, RI, and an advanced technology center in Hyderabad, India. We are clearly differentiated in the data & analytics space via our suite of solutions, accelerators, frameworks, and thought leadership. Job Description: We are seeking a highly experienced and hands-on Lead/Senior Data Engineer to architect, develop, and optimize data solutions in a cloud-native environment. The ideal candidate will have 7–12 years of strong technical expertise in AWS Glue, PySpark, and Python, along with experience designing robust data pipelines and frameworks for large-scale enterprise systems. Prior exposure to the financial domain or regulated environments is a strong advantage. Key Responsibilities: Solution Architecture: Design scalable and secure data pipelines using AWS Glue, PySpark, and related AWS services (EMR, S3, Lambda, etc.). Leadership & Mentorship: Guide junior engineers, conduct code reviews, and enforce best practices in development and deployment. ETL Development: Lead the design and implementation of end-to-end ETL processes for structured and semi-structured data. Framework Building: Develop and evolve data frameworks, reusable components, and automation tools to improve engineering productivity. Performance Optimization: Optimize large-scale data workflows for performance, cost, and reliability. Data Governance: Implement data quality, lineage, and governance strategies in compliance with enterprise standards. Collaboration: Work closely with product, analytics, compliance, and DevOps teams to deliver high-quality solutions aligned with business goals. CI/CD Automation: Set up and manage continuous integration and deployment pipelines using AWS CodePipeline, Jenkins, or GitLab. Documentation & Presentations: Prepare technical documentation and present architectural solutions to stakeholders across levels. Requirements – Required Qualifications: 7–12 years of experience in data engineering or related fields. Strong expertise in Python programming with a focus on data processing. Extensive experience with AWS Glue (both Glue Jobs and Glue Studio/Notebooks). Deep hands-on experience with PySpark for distributed data processing. Solid AWS knowledge: EMR, S3, Lambda, IAM, Athena, CloudWatch, Redshift, etc. Proven experience architecting and managing complex ETL workflows. Proficiency with Apache Airflow or similar orchestration tools. Hands-on experience with CI/CD pipelines and DevOps best practices. Familiarity with data quality, data lineage, and metadata management. Strong experience working in agile/scrum teams. Excellent communication and stakeholder engagement skills. Preferred/Good to Have: Experience in financial services, capital markets, or compliance systems. Knowledge of data modeling, data lakes, and data warehouse architecture. Familiarity with SQL (Athena/Presto/Redshift Spectrum). Exposure to ML pipeline integration or event-driven architecture is a plus. Benefits: Flexible work culture and remote options. Opportunity to lead cutting-edge cloud data engineering projects. Skill-building in large-scale, regulated environments.

Posted 2 weeks ago

Apply

0.0 - 5.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

Location Chennai, Tamil Nadu, India Job ID R-230693 Date posted 08/07/2025 Job Title: Engineer Introduction to role: Are you ready to make a significant impact in the world of biopharmaceuticals? AstraZeneca, a global leader in innovation-driven prescription medicines, is seeking a dedicated Data Engineer to join our Commercial IT Data Analytics & AI (DAAI) team. With operations in over 100 countries and headquarters in the United Kingdom, AstraZeneca offers a unique workplace culture that fosters innovation and collaboration. As a Data Engineer, you will play a crucial role in supporting and enhancing our data platforms built on AWS services. Your expertise in ETL, Data Warehousing, Databricks, and AWS applications will be vital in ensuring business continuity and driving efficiency. Are you up for the challenge? Accountabilities Monitor and maintain the health and performance of production systems and applications. Provide timely incident response, troubleshooting, and resolution for technical issues raised by users or monitoring tools. Perform root cause analysis for recurring issues and implement preventive measures. Investigate data anomalies, troubleshoot failures, and coordinate with relevant teams for resolution. Collaborate with development and infrastructure teams to support deployments and configuration changes. Maintain and update technical documentation, standard operating procedures, and knowledge bases. Ensure adherence to service-level agreements (SLAs) and minimize downtime or service disruptions. Manage user access, permissions, and security-related requests as per organizational policies. Participate in on-call rotations and provide after-hours support as needed. Communicate effectively with collaborators, providing status updates and post-incident reports. Proactively identify opportunities for automation and process improvement in support activities. Support data migration, upgrades, and transitions as required. Support business continuity and disaster recovery exercises as required. Essential Skills/Experience Education Background: B.E/B.Tech/MCA/MSc/BSc. Overall Years of Experience: 3 to 5 years of experience. Solid experience with SQL, data warehousing, and building ETL pipelines. Hands-on experience with AWS services, including EMR, EC2, S3, Athena, RDS, Databricks, and Redshift. Skilled in working with columnar databases such as Redshift, Cassandra, or BigQuery. Good understanding of ETL processes and data warehousing concepts. Familiarity with scheduling tools (Airflow especially is a plus). Able to write complex SQL queries for data extraction, transformation, and reporting. Excellent communication skills and ability to work well with both technical and non-technical teams. Strong analytical and troubleshooting skills in complex data environments. Desirable Skills/Experience Experience with Databricks or Snowflake. Proficient in scripting and programming languages such as Shell Scripting and Python. Familiar with CI/CD using Bamboo. Proficient in version control systems, including Bitbucket and GitHub. Preferably experienced with release management processes. Significant prior experience in an IT environment within the pharmaceutical or healthcare industry. At AstraZeneca, we are committed to driving exciting transformation on our journey to becoming a digital and data-led enterprise. Our work connects across the entire business to power each function, influencing patient outcomes and improving lives.
By unleashing the power of our latest innovations in data, machine learning, and technology, we turn complex information into life-changing insights. Join us to work alongside leading experts in our specialist communities, where your contributions are recognized from the top. Ready to take the next step? Apply now to become part of our dynamic team! Date Posted 09-Jul-2025 Closing Date 13-Jul-2025 AstraZeneca embraces diversity and equality of opportunity. We are committed to building an inclusive and diverse team representing all backgrounds, with as wide a range of perspectives as possible, and harnessing industry-leading skills. We believe that the more inclusive we are, the better our work will be. We welcome and consider applications to join our team from all qualified candidates, regardless of their characteristics. We comply with all applicable laws and regulations on non-discrimination in employment (and recruitment), as well as work authorization and employment eligibility verification requirements.

Posted 2 weeks ago

Apply

9.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

** Hiring for our customer - a well-known Data/Cloud Engineering company ** Job Title - Lead Cloud Data Engineer Job Locations - Noida, Bengaluru, Indore Experience - 9 Years to 12 Years Hiring a hands-on technical lead. Hands-on exposure to Big Data technologies – PySpark (DataFrame and SparkSQL). Strong SQL and Data Warehousing. AWS – create AWS pipelines with the required AWS services, e.g., S3, IAM, Glue, EMR, Redshift. Orchestration with Airflow; experience with any job scheduler. Develop efficient ETL pipelines as per business requirements. Team-leading experience is a must.

Posted 2 weeks ago

Apply

2.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Description About Amazon.com: Amazon.com strives to be Earth's most customer-centric company, where people can find and discover virtually anything they want to buy online. By giving customers more of what they want - low prices, vast selection, and convenience - Amazon.com continues to grow and evolve as a world-class e-commerce platform. Amazon's evolution from Web site to e-commerce partner to development platform is driven by the spirit of innovation that is part of the company's DNA. The world's brightest technology minds come to Amazon.com to research and develop technology that improves the lives of shoppers and sellers around the world. About Team The RBS group is an integral part of Amazon's online product lifecycle and buying operations. The team is designed to ensure Amazon remains competitive in the online retail space with the best price, wide selection, and good product information. The team's primary role is to create and enhance retail selection on the worldwide Amazon online catalog. The tasks handled by this group have a direct impact on customer buying decisions and online user experience. Overview of the role The Business Research Analyst will be responsible for continuous improvement projects across the RBS teams and their delivery levers. The long-term goal of the Research Analyst (RA) role is to eliminate defects and automate qualifying tasks. Secondary goals are to improve the vendor and customer experience and to enhance GMS/FCF. This will require collaboration with local and global teams that have process and technical expertise. Therefore, the RA should be a self-starter who is passionate about discovering and solving complicated problems, learning complex systems, working with numbers, and organizing and communicating data and reports. The RA works across teams and the Ops organization at country, regional, and/or cross-regional level to drive improvements and enablers, implementing solutions for customers and cost savings in process workflows, systems configuration, and performance metrics. The RA leads business-critical projects and opportunities across Operations (FCs, Sortation, logistics centres, Supply Chain, Transportation, Engineering, ...), which may be global in nature. The RA performs big data analysis to identify defect patterns and process gaps and proposes long-term solutions to eliminate those defects and issues. The RA writes clear and detailed functional specifications based on business requirements, and writes and reviews business cases. Key Responsibilities for this Role: Scoping, driving, and delivering complex projects across multiple teams. Performing root cause analysis by understanding the data need, pulling the data, and analyzing it to form hypotheses and validate them with data. Diving deep to drive product pilots, building and analyzing large data sets, and constructing problem hypotheses that help steer the product feature roadmap (e.g., using R, SAS, STATA, Matlab, Python, or Java), database tools (e.g., SQL, Redshift), and ML tools (RapidMiner, Eider). Building programs to create a culture of continuous improvement within the business unit, and fostering a customer-centric focus on the quality, productivity, and scalability of our services. Finding scalable solutions to business problems by executing pilots and building deterministic and ML models (plug-and-play on ready-made ML models with Python skills). Managing meetings and business and technical discussions regarding their part of the projects.
Makes recommendations and decisions that impact development schedules and the success of a product or project. Drives teams and partners to meet program and/or product goals. Coordinates design efforts between internal and external teams to develop optimal solutions for their part of the project for Amazon's network. Supports identification of downstream problems (e.g., system incompatibility, resource unavailability) and escalates them to the appropriate level before they become project-threatening. Performs supporting research, conducts analysis of the larger parts of the projects, and effectively interprets reports to identify opportunities, optimize processes, and implement changes within their part of the project. Ability to convince and interact with stakeholders at all levels, whether to gather data and information or to execute and implement according to plan. Ability to deal with ambiguity and solve problems. Build reports from established data warehouses and self-service reporting tools. Communicate ideas effectively and with influence (both verbally and in writing), within and outside the team. Key Performance Areas Solve large and complex business problems by aligning multiple teams. Data analytics and data science. Machine learning. Basic Qualifications 2+ years of experience analyzing and interpreting data with Redshift, Oracle, NoSQL, etc. Bachelor's degree 3+ years of Python or R experience 1+ years of experience in financial/business analysis 2+ years of SQL experience 2+ years of ML project experience 2+ years of experience with data analysis packages (NumPy, Pandas, SciPy, etc.) Preferred Qualifications Knowledge of data modeling and data pipeline design NLP and text processing Deep learning Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner. Company - ADCI - BLR 14 SEZ - F07 Job ID: A2968108
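By way of illustration of the analysis loop this role describes (pulling defect data from Redshift, exploring it, and fitting a simple baseline model), here is a minimal Python sketch. It assumes a PostgreSQL-compatible connection to Redshift and a hypothetical catalog_defects table with invented columns; it is not Amazon's actual tooling or schema.

```python
# Hedged sketch: query a Redshift-style warehouse, explore with Pandas, fit a baseline classifier.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sqlalchemy import create_engine

# Redshift speaks the PostgreSQL wire protocol; the DSN below is a placeholder.
engine = create_engine("postgresql+psycopg2://user:pass@example-cluster:5439/analytics")
df = pd.read_sql(
    "SELECT price_delta, title_length, has_image, is_defect FROM catalog_defects", engine)

X_train, X_test, y_train, y_test = train_test_split(
    df[["price_delta", "title_length", "has_image"]], df["is_defect"],
    test_size=0.2, random_state=42)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("Holdout accuracy:", model.score(X_test, y_test))
```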

Posted 2 weeks ago

Apply

10.0 years

0 Lacs

Pune, Maharashtra, India

Remote

JOB_POSTING-3-72216-5 Job Description Role Title: VP, Data Engineering Tech Lead (L12) Company Overview: Synchrony (NYSE: SYF) is a premier consumer financial services company delivering one of the industry’s most complete digitally enabled product suites. Our experience, expertise and scale encompass a broad spectrum of industries including digital, health and wellness, retail, telecommunications, home, auto, outdoors, pet and more. We have recently been ranked #5 among India’s Best Companies to Work for 2023, #21 on LinkedIn’s Top Companies in India list, and received Top 25 BFSI recognition from Great Place To Work India. We have been ranked Top 5 among India’s Best Workplaces in Diversity, Equity, and Inclusion, and Top 10 among India’s Best Workplaces for Women in 2022. We offer 100% Work from Home flexibility for all our Functional employees and provide some of the best-in-class Employee Benefits and Programs catering to work-life balance and overall well-being. In addition to this, we also have Regional Engagement Hubs across India and a co-working space in Bangalore. Organizational Overview: This role will be part of the Data Architecture & Analytics group within the CTO organization. The Data team is responsible for designing and developing scalable data pipelines for efficient data ingestion, transformation, and loading (ETL); collaborating with cross-functional teams to integrate new data sources and ensure data quality and consistency; and building and maintaining data models to facilitate data access and analysis by Data Scientists and Analysts. The team is also responsible for the SYF public cloud platform & services: governing the health, performance, capacity, and costs of resources; ensuring adherence to service levels; and building well-defined processes for cloud application development and service enablement. Role Summary/Purpose We are seeking a highly skilled Cloud Technical Lead with expertise in Data Engineering who will work in multi-disciplinary environments harnessing data to provide valuable impact for our clients. The Cloud Technical Lead will work closely with technology and functional teams to drive migration of legacy on-premises data systems/platforms to cloud-based solutions. The successful candidate will need to develop intimate knowledge of SYF key data domains (originations, loan activity, collection, etc.) and maintain a holistic view across SYF functions to minimize redundancies and optimize the analytics environment. Key Responsibilities Manage the end-to-end project lifecycle, including planning, execution, and delivery of cloud-based data engineering projects. Provide guidance on suitable options, designing and creating data pipelines for analytical solutions across data lakes, data warehouses, and cloud implementations. Architect and design robust data pipelines and ETL processes leveraging Ab Initio and Amazon Redshift. Ensure data integration, transformation, and storage processes are optimized for scalability and performance in the cloud environment. Ensure data security, governance, and compliance in the cloud infrastructure. Provide leadership and guidance to data engineering teams, ensuring best practices are followed. Ensure timely delivery of high-quality solutions in an Agile environment.
Required Skills/Knowledge Minimum 10+ years of experience with a Bachelor's degree in Computer Science or similar technical field of study, or, in lieu of a degree, 12+ years of relevant experience. Minimum 10+ years of experience in managing large-scale data platform (Data Warehouse/Data Lake/Cloud) environments. Minimum 10+ years of financial services experience. Minimum 6+ years of experience working with Data Warehouses/Data Lakes/Cloud. 6+ years of hands-on programming experience in ETL tools - Ab Initio or Informatica highly preferred. Be able to read and reverse engineer the logic in Ab Initio graphs. Hands-on experience with cloud platforms such as S3, Redshift, Snowflake, etc. Working knowledge of Hive, Spark, Kafka, and other data lake technologies. Strong familiarity with data governance, data lineage, data processes, DML, and data architecture control execution. Experience analyzing system requirements and implementing migration methods for existing data. Ability to develop and maintain strong collaborative relationships at all levels across IT and the business. Excellent written and oral communication skills, along with a strong ability to lead and influence others. Experience working iteratively in a fast-paced agile environment. Demonstrated ability to drive change and work effectively across business and geographical boundaries. Expertise in evaluating technology and solution engineering, with a strong focus on architecture and deployment of new technology. Superior decision-making, client relationship, and vendor management skills. Desired Skills/Knowledge Prior work experience in a credit card/banking/fintech company. Experience dealing with sensitive data in a highly regulated environment. Demonstrated implementation of complex and innovative solutions. Agile experience using JIRA or similar Agile tools. Eligibility Criteria Bachelor's degree in Computer Science or similar technical field of study (Master's degree preferred). Minimum 12+ years of experience in managing large-scale data platform (Data Warehouse/Data Lake/Cloud) environments. Minimum 12+ years of financial services experience. Minimum 8+ years of experience working with Oracle Data Warehouses/Data Lakes/Cloud. 8+ years of programming experience in ETL tools - Ab Initio or Informatica highly preferred. Be able to read and reverse engineer the logic in Ab Initio graphs. Hands-on experience with cloud platforms such as S3, Redshift, Snowflake, etc. Rigorous data analysis through SQL in Oracle and various Hadoop technologies. Involvement in large-scale data analytics migration from on-premises to a public cloud. Strong familiarity with data governance, data lineage, data processes, DML, and data architecture control execution. Experience analyzing system requirements and implementing migration methods for existing data. Excellent written and oral communication skills, along with a strong ability to lead and influence others. Experience working iteratively in a fast-paced agile environment. Work Timings: 3:00 PM IST to 12:00 AM IST (This role qualifies for Enhanced Flexibility and Choice offered in Synchrony India and will require the incumbent to be available between 06:00 AM Eastern Time and 11:30 AM Eastern Time (timings are anchored to US Eastern hours and will adjust twice a year locally). This window is for meetings with India and US teams. The remaining hours will be flexible for the employee to choose. Exceptions may apply periodically due to business needs.
Please discuss this with the hiring manager for more details.) For Internal Applicants: Understand the criteria or mandatory skills required for the role before applying. Inform your manager and HRM before applying for any role on Workday. Ensure that your professional profile is updated (fields such as education, prior experience, other skills); it is mandatory to upload your updated resume (Word or PDF format). Must not be on any corrective action plan (First Formal/Final Formal, PIP). Only L10+ employees who have completed 18 months in the organization and 12 months in their current role and level are eligible to apply. Level/Grade: 12. Job Family Group: Information Technology

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Project Role: Application Lead. Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact. Must-have skills: Data Analytics. Good-to-have skills: Microsoft SQL Server, Python (Programming Language), AWS Redshift. Minimum 5 year(s) of experience is required. Educational Qualification: 15 years of full-time education. Summary The purpose of the Data Engineering function within the Data and Analytics team is to develop and deliver great data assets and data domain management for our Personal Banking customers and colleagues, seamlessly and reliably every time. As a Senior Data Engineer, you will bring expertise in data handling, curation, and conformity capabilities to the team; support the design and development of solutions which assist analysis of data to drive tangible business benefit; and assist colleagues in developing solutions that will enable the capture and curation of data for analysis, analytical and/or reporting purposes. The Senior Data Engineer must have experience working as part of an agile team developing solutions in a complex enterprise. Roles & Responsibilities Hands-on development experience in Data Warehousing and/or Software Development. Experience utilising tools and practices to build, verify, and deploy solutions in the most efficient ways. Experience in Data Integration and Data Sourcing activities. Experience developing data assets to support optimised analysis for customer and regulatory outcomes. Provide ongoing support for platforms as required, e.g., problem and incident management. Experience in Agile software development, including GitHub, Confluence, and Rally. Professional & Technical Skills Experience with cloud technologies, especially AWS (S3, Redshift, Airflow), and DevOps and DataOps tools (Jenkins, Git, Erwin). Advanced SQL and Python user. Knowledge of UNIX, Spark, and Databricks. Additional Information Position: Senior Analyst, Data Engineering. Reports to: Manager, Data Engineering. Division: Personal Bank. Group: 3. Industry/domain skills: Some expertise in Retail Banking, Business Banking, and/or Wealth Management preferred.
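As a rough, hedged illustration of the AWS S3/Redshift work referenced above, the sketch below loads curated Parquet files from S3 into a Redshift table with the COPY command over a psycopg2 connection. The schema, bucket, IAM role, and cluster endpoint are all placeholders, not details from the posting.

```python
# Sketch of the common S3-to-Redshift load pattern using Redshift's COPY command.
import psycopg2

COPY_SQL = """
COPY personal_banking.transactions
FROM 's3://example-bucket/curated/transactions/'
IAM_ROLE 'arn:aws:iam::123456789012:role/example-redshift-copy'
FORMAT AS PARQUET;
"""

# Placeholder cluster endpoint and credentials; Redshift accepts PostgreSQL-protocol clients.
with psycopg2.connect(host="example-cluster.abc123.ap-south-1.redshift.amazonaws.com",
                      port=5439, dbname="analytics", user="etl", password="secret") as conn:
    with conn.cursor() as cur:
        cur.execute(COPY_SQL)
```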

Posted 2 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies