6.0 - 8.0 years
8 - 10 Lacs
Hyderabad
Work from Office
We are seeking a highly skilled and motivated Senior Snowflake Developer to join our growing data engineering team. In this role, you will be responsible for building scalable and secure data pipelines and Snowflake-based architectures that power data analytics across the organization. You'll collaborate with business and technical stakeholders to design robust solutions in an AWS environment and play a key role in driving our data strategy forward.

Responsibilities:
- Design, develop, and maintain efficient and scalable Snowflake data warehouse solutions on AWS.
- Build robust ETL/ELT pipelines using SQL, Python, and AWS services (e.g., Glue, Lambda, S3).
- Collaborate with data analysts, engineers, and business teams to gather requirements and design data models aligned with business needs.
- Optimize Snowflake performance through best practices in clustering, partitioning, caching, and query tuning.
- Ensure data quality, accuracy, and completeness across data pipelines and warehouse processes.
- Maintain documentation and enforce best practices for data architecture, governance, and security.
- Continuously evaluate tools, technologies, and processes to improve system reliability, scalability, and performance.
- Ensure compliance with relevant data privacy and security regulations (e.g., GDPR, CCPA).

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Minimum 6 years of experience in data engineering, with at least 3 years of hands-on experience with Snowflake.
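As an illustration of the ELT pipeline work this posting describes, here is a minimal sketch of composing an idempotent Snowflake MERGE statement in Python. It is not from the posting itself, and the table and column names (`analytics.orders`, `staging.orders`, `order_id`) are hypothetical.

```python
def build_merge_sql(target, staging, key, cols):
    """Build a Snowflake MERGE that upserts staged rows into a target table,
    so re-running the load step does not duplicate rows."""
    set_clause = ", ".join(f"t.{c} = s.{c}" for c in cols)
    col_list = ", ".join([key] + cols)
    src_list = ", ".join(f"s.{c}" for c in [key] + cols)
    return (
        f"MERGE INTO {target} t USING {staging} s ON t.{key} = s.{key} "
        f"WHEN MATCHED THEN UPDATE SET {set_clause} "
        f"WHEN NOT MATCHED THEN INSERT ({col_list}) VALUES ({src_list})"
    )

# Hypothetical example: upsert staged orders into the analytics schema.
sql = build_merge_sql("analytics.orders", "staging.orders", "order_id",
                      ["status", "amount"])
```

The generated statement would be passed to a Snowflake connection for execution; generating it this way keeps the load step reusable across tables.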
Posted 2 months ago
4.0 - 10.0 years
6 - 9 Lacs
Chennai
Work from Office
Experienced. Chennai. Posted 1 week ago. SolvEdge.

We're dedicated to leveraging technology to make a positive impact in healthcare. Our software solutions are crafted to optimize processes, support patient care, and drive better health outcomes. As we continue to innovate, we're seeking an experienced PostgreSQL Developer to join our team. If you're enthusiastic about scalable database development and eager to contribute to meaningful healthcare technology projects, we want you on our journey to empower healthcare professionals with advanced tools and insights.

What You'll Do
We are looking for a skilled and detail-oriented PostgreSQL Developer with 4–6 years of hands-on experience to join our dynamic engineering team. In this role, you will be responsible for designing, developing, and optimizing PostgreSQL databases that power high-performance applications in the healthcare sector. You will collaborate with architects, backend engineers, and business analysts to deliver reliable and scalable data solutions.

Responsibilities

Database Development and Optimization
- Design and implement efficient PostgreSQL schemas, indexes, constraints, and relationships.
- Develop advanced SQL queries, stored procedures, views, and triggers using PostgreSQL.
- Optimize complex queries and database performance for scalability and speed.
- Perform data profiling, query tuning, and performance analysis.

Data Architecture and Modeling
- Create and maintain logical and physical data models based on business requirements.
- Define standards for data consistency, normalization, and integrity.
- Implement data validation rules and constraints to ensure data accuracy.

Integration and Collaboration
- Collaborate with backend developers to ensure seamless data access through APIs and services.
- Design and implement ETL processes for internal data flows and external data ingestion.
- Work with cross-functional teams to translate business requirements into database logic.

Tools and Automation
- Utilize tools for database versioning (e.g., Flyway, Liquibase).
- Automate database deployments and migrations within CI/CD pipelines.

Continuous Improvement
- Monitor emerging PostgreSQL features and best practices.
- Recommend and implement improvements in data design, coding practices, and performance strategy.

Qualifications
- Bachelor's degree in Computer Science, Engineering, or an equivalent technical field.
- 4–6 years of professional experience with PostgreSQL database development.
- Experience working in Agile/Scrum environments.
- Exposure to microservices and cloud-native applications is an advantage.

Primary Skills
- PostgreSQL: Strong proficiency in PostgreSQL and advanced SQL.
- SQL Development: Experience building reusable stored procedures, functions, views, CTEs, and triggers.
- Performance Tuning: Expertise in optimizing complex queries using indexing, execution plans, and materialized views.
- Schema Design: In-depth knowledge of data modeling, normalization, and relational design.
- Data Integration: Experience with data pipelines, ETL processes, and transforming structured/semi-structured data.
- JSON/JSONB: Practical experience working with unstructured data and PostgreSQL's advanced JSON features.
- ORMs: Experience integrating PostgreSQL with ORMs such as Sequelize, Hibernate, or SQLAlchemy.

Secondary Skills
- Experience working with cloud-based PostgreSQL (e.g., AWS RDS, Azure Database for PostgreSQL).
- Familiarity with RESTful APIs and backend service integration.
- Working knowledge of NoSQL alternatives, hybrid storage strategies, or data lakes.
- CI/CD and DevOps understanding for integrating DB updates into pipelines.
- Strong analytical and debugging skills.
- Effective communication and documentation abilities to interact with stakeholders.

Why Apply?
Even if you feel you don't meet every single requirement, we encourage you to apply. We're looking for passionate individuals who may bring diverse perspectives and skills to our team.
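To make the JSON/JSONB skill above concrete, here is a hedged sketch of assembling a parameterized PostgreSQL query that uses the `->>` (text extraction) and `@>` (containment) JSONB operators. The `events` table, its `payload` column, and the filter values are hypothetical examples, not part of the posting.

```python
def patient_events_query(event_type_param="%(event_type)s"):
    """Return a PostgreSQL query filtering JSONB payloads by containment
    and extracting a text field; the placeholder style matches psycopg's
    named-parameter convention."""
    return (
        "SELECT id, payload ->> 'patient_id' AS patient_id "
        "FROM events "
        "WHERE payload @> '{\"source\": \"monitor\"}' "
        f"AND payload ->> 'type' = {event_type_param}"
    )

q = patient_events_query()
```

In practice the query string would be executed with a driver such as psycopg, passing `{"event_type": ...}` as the parameter mapping.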
At SolvEdge, we value talent and dedication and are committed to fostering growth within our organization.

How to Apply?
Ready to make a difference? Submit your resume, a cover letter that highlights your qualifications, and any relevant experience. We look forward to hearing from you!

SolvEdge is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees.

About SolvEdge
SolvEdge: Pioneering the Future of Digital Healthcare

Our Expertise
SolvEdge stands at the forefront of digital healthcare innovation as a premier healthcare performance company. With over 18 years of dedicated service in the healthcare industry, we specialize in a digital care journey platform that revolutionizes how hospitals and health systems engage, monitor, and connect with patients throughout their healthcare experiences. Our partnership with Fortune 100 medical device companies and hospitals nationwide underscores our position as a trusted partner in healthcare solutions.

Key Features of SolvEdge
Our platform is designed to empower healthcare providers with the tools they need to automate and streamline care delivery, thereby improving clinical outcomes and patient satisfaction.
- Personalized Care Plans: Leveraging evidence-based data, SolvEdge delivers digital care plans customized to meet the individual needs and conditions of each patient.
- Real-Time Patient Monitoring: Through daily health checks, assessments, surveys, and integration with wearable devices, our platform facilitates continuous monitoring of patient health.
- Automated Care Delivery: We automate essential tasks, including appointment scheduling, sending reminders, and delivering educational content, to enhance patient engagement and reduce administrative tasks.
- Remote Patient Monitoring: Healthcare providers can monitor vital signs, symptoms, and treatment plan adherence remotely, enabling timely interventions and proactive care management.
The SolvEdge Advantage
Our platform offers significant benefits to healthcare providers and patients alike:
- Improved Clinical Outcomes: By facilitating more effective care pathways and enabling early intervention, SolvEdge contributes to reduced readmission rates, fewer emergency department visits, and shorter hospital stays.
- Enhanced Patient Satisfaction: Patients enjoy a higher quality of care with SolvEdge, benefiting from improved communication, comprehensive education, and continuous support.
- Cost Savings: Healthcare organizations can achieve substantial cost reductions by minimizing unnecessary readmissions, emergency visits, and complications associated with poor care management.

Applications and Impact
SolvEdge's versatility allows for its application across various aspects of healthcare, with a particular emphasis on surgical care. From preparing patients for surgery to monitoring their post-operative recovery, our platform ensures a seamless and supportive care journey. Beyond surgical care, our focus encompasses managing care pathways, enhancing patient engagement through patient-reported outcomes, providing advanced data analytics, integrating with electronic medical records (EMR), and streamlining billing processes. Our comprehensive approach addresses the myriad challenges faced by today's healthcare industry, backed by our commitment to excellence in service, communication, and customer experience.

A Trusted Partner in Healthcare Innovation
Our strategic relationships and deep understanding of healthcare challenges have positioned us as an indispensable ally to healthcare providers nationwide. As we continue to develop innovative solutions, our goal remains unchanged: to simplify healthcare delivery, improve patient outcomes, and enhance the overall patient experience.

Job Category: Developer
Posted 2 months ago
8.0 - 13.0 years
25 - 30 Lacs
Bengaluru
Work from Office
Are you passionate about building scalable data systems and driving data quality across complex ecosystems? Join Derisk360 to work on advanced cloud and data engineering initiatives that power intelligent business decision-making.

What You'll Do:
- Work with a broad stack of AWS services: S3, AWS Glue, Glue Catalog, Lambda, Step Functions, EventBridge, and more.
- Develop and implement robust data quality checks using DQ libraries.
- Lead efforts in data modeling and manage relational and NoSQL databases.
- Build and automate ETL workflows using Unix scripting, DevOps, and Agile methodologies, including use of CI/CD tools and code repositories.
- Engineer scalable big data solutions with Apache Spark.
- Design impactful dashboards using Amazon QuickSight and Microsoft Power BI.
- Work extensively on integrating real-time data pipelines with data sourcing strategies, including real-time integration solutions.
- Spearhead cloud migration efforts to Azure Data Lake, including data transitions from on-premise environments.

What You Bring:
- 8+ years of hands-on experience in data engineering roles.
- Proficiency in AWS cloud services and modern ETL technologies.
- Solid programming experience.
- Strong understanding of data architecture, quality frameworks, and reporting tools.
- Experience working in Agile environments and using version control/CI pipelines.
- Exposure to big data frameworks, real-time integration tools, and cloud data platforms.

What You'll Get:
- Competitive compensation.
- Lead and contribute to mission-critical data engineering projects.
- Work in a high-performance team.
- Continuous learning environment with access to cutting-edge technologies.
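The data-quality checks mentioned above can be sketched without any DQ library at all; the idea is the same one frameworks like Deequ or Great Expectations automate. This is an illustrative, library-free example over rows held as dicts, with hypothetical column names.

```python
def null_rate(rows, column):
    """Fraction of rows where `column` is missing or None."""
    if not rows:
        return 0.0
    nulls = sum(1 for r in rows if r.get(column) is None)
    return nulls / len(rows)

def is_unique(rows, column):
    """True when the non-null values of `column` never repeat."""
    seen = [r[column] for r in rows if r.get(column) is not None]
    return len(seen) == len(set(seen))

# Hypothetical sample: one missing email, one duplicated email.
rows = [{"id": 1, "email": "a@x.com"},
        {"id": 2, "email": None},
        {"id": 3, "email": "a@x.com"}]
```

A pipeline would run such checks after each load and fail the job (or raise an alert) when a threshold is breached.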
Posted 2 months ago
8.0 - 13.0 years
25 - 30 Lacs
Hyderabad
Work from Office
About the Role
We are hiring a Staff Data Engineer to join our India Operations and play a crucial role in our mission to establish a world-class data engineering team within the Center for Data and Insights (CDI). Reporting directly to the Director of Data Engineering, you will be a key contributor, advancing our data engineering capabilities in the AWS and GCP ecosystems. Your responsibilities include collaborating with key stakeholders, guiding and mentoring fellow data engineers, and working hands-on in domains such as data architecture, data lake infrastructure, and data and ML job orchestration. Your contributions will ensure the consistency and reliability of data and insights, aligning with our objective of enabling well-informed decision-making.

The ideal candidate will demonstrate an empathetic and service-oriented approach, fostering a thriving data and insights culture while enhancing and safeguarding our data infrastructure. This role presents a unique opportunity to build and strengthen our data engineering platforms at a global level. If you are an experienced professional with a passion for impactful data engineering initiatives and a commitment to driving transformative change, we encourage you to explore this role. Joining us as a Staff Data Engineer allows you to significantly contribute to the trajectory of our CDI, making a lasting impact on our data-centric aspirations as we aim for new heights.

Core Areas of Responsibility
- Implement robust data infrastructure, platforms, and solutions.
- Collaborate effectively with cross-functional teams and CDI leaders, ensuring the delivery of timely data loads and jobs tailored to their unique needs.
- Guide and mentor a team of skilled data engineers, prioritizing a service-oriented approach and quick response times.
- Advocate for the enhancement of, and adherence to, high data quality standards, KPI certification methods, and engineering best practices.
- Approach reporting platforms and analytical processes with innovative thinking, considering the evolving demands of the business.
- Implement the strategy for migrating from AWS to GCP with near-real-time events and machine learning pipelines, using our customer data platform (Segment) and purpose-built pipelines and databases to activate systems of intelligence.
- Continuously improve reporting workflows and efficiency, harnessing the power of automation whenever feasible.
- Enhance the performance, reliability, and scalability of the storage and compute layers of the data lake.

About You
We get excited about candidates, like you, because you bring:
- 8+ years of hands-on experience in data engineering and/or software development.
- Strong skills in programming languages like Python, Spark, and SQL.
- Comfort using BI tools like Tableau, Looker, Preset, and so on.
- Proficiency in event data collection tools such as Snowplow, Segment, Google Tag Manager, Tealium, mParticle, and more.
- Comprehensive expertise across the entire lifecycle of implementing compute and orchestration tools like Databricks, Airflow, Talend, and others.
- Skill in working with streaming OLAP engines like Druid, ClickHouse, and similar technologies.
- Experience leveraging AWS services including EMR Spark, Redshift, Kinesis, Lambda, Glue, S3, and Athena, among others.
- Nice to have: exposure to GCP services like BigQuery, Google Storage, Looker, Google Analytics, and so on.
- A good understanding of building real-time data systems as well as AI/ML personalization products.
- Experience with Customer Data Platforms (CDPs) and Data Management Platforms (DMPs), contributing to holistic data strategies.
- Familiarity with high-security environments like HIPAA, PCI, or similar contexts, highlighting a commitment to data privacy and security.
- Accomplishment in managing large-scale data sets, handling terabytes of data and billions of records effectively.
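At the heart of the "data and ML job orchestration" tools named above (Airflow and the like) is dependency ordering: every job runs only after its upstreams finish. A minimal sketch, with a hypothetical four-job DAG:

```python
def topo_order(deps):
    """Return job names ordered so that every job appears after all of its
    dependencies. `deps` maps job -> list of upstream jobs (assumed acyclic)."""
    order, done = [], set()

    def visit(job):
        if job in done:
            return
        for upstream in deps.get(job, []):
            visit(upstream)          # schedule upstreams first
        done.add(job)
        order.append(job)

    for job in deps:
        visit(job)
    return order

# Hypothetical DAG: extract feeds a transform, which feeds a model-training
# job and a KPI-publishing job.
dag = {"extract": [],
       "transform": ["extract"],
       "train_model": ["transform"],
       "publish_kpis": ["transform"]}
order = topo_order(dag)
```

Real orchestrators add cycle detection, retries, and scheduling on top, but the ordering contract is the same.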
You hold a Bachelor's degree in Computer Science, Information Systems, or a related field, providing strong foundational knowledge.
Posted 2 months ago
8.0 - 13.0 years
25 - 30 Lacs
Hyderabad
Work from Office
We are seeking a seasoned Data Engineering Manager with 8+ years of experience to lead and grow our data engineering capabilities. This role demands strong hands-on expertise in Python, SQL, and Spark, and advanced proficiency in AWS and Databricks. As a technical leader, you will be responsible for architecting and optimizing scalable data solutions that enable analytics, data science, and business intelligence across the organization.

Key Responsibilities:
- Lead the design, development, and optimization of scalable and secure data pipelines using AWS services such as Glue, S3, Lambda, and EMR, and Databricks Notebooks, Jobs, and Workflows.
- Oversee the development and maintenance of data lakes on AWS Databricks, ensuring performance and scalability.
- Build and manage robust ETL/ELT workflows using Python and SQL, handling both structured and semi-structured data.
- Implement distributed data processing solutions using Apache Spark/PySpark for large-scale data transformation.
- Collaborate with cross-functional teams including data scientists, analysts, and product managers to ensure data is accurate, accessible, and well-structured.
- Enforce best practices for data quality, governance, security, and compliance across the entire data ecosystem.
- Monitor system performance, troubleshoot issues, and drive continuous improvements in data infrastructure.
- Conduct code reviews, define coding standards, and promote engineering excellence across the team.
- Mentor and guide junior data engineers, fostering a culture of technical growth and innovation.

Requirements:
- 8+ years of experience in data engineering with proven leadership in managing data projects and teams.
- Expertise in Python, SQL, and Spark (PySpark).
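The PySpark transformations this role calls for typically reduce to group-and-aggregate steps. As a hedged illustration that runs without a cluster, here is a plain-Python mirror of what `df.groupBy("region").sum("amount")` would do in PySpark; the field names are hypothetical.

```python
from collections import defaultdict

def total_amount_by_region(records):
    """Plain-Python equivalent of a Spark groupBy/sum over rows as dicts."""
    totals = defaultdict(float)
    for r in records:
        totals[r["region"]] += r["amount"]
    return dict(totals)

# Hypothetical sample rows, as a Spark DataFrame would hold them.
records = [{"region": "south", "amount": 10.0},
           {"region": "north", "amount": 5.0},
           {"region": "south", "amount": 2.5}]
totals = total_amount_by_region(records)
```

On real data volumes Spark distributes exactly this shuffle-and-sum across executors; the logic per group is unchanged.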
Posted 2 months ago
4.0 - 8.0 years
9 - 13 Lacs
Bengaluru
Work from Office
We're Hiring: Delivery Solution Architect – Data Analytics (AWS)

Location: Remote
Level: Mid to Senior (3–5+ years experience)

Are you passionate about turning complex data challenges into elegant, scalable solutions on the AWS cloud? We're looking for a Delivery Solution Architect – Data Analytics to join our growing team and take the lead in architecting and delivering next-gen data platforms that drive real business impact.

About the Role:
As a Delivery Solution Architect, you will play a pivotal role in designing and implementing end-to-end data analytics solutions on AWS. You'll collaborate with cross-functional teams and lead a group of 5–10 consultants and engineers to bring modern data architectures to life, powering business intelligence, machine learning, and operational insights.

Key Responsibilities:
- Lead the design and delivery of data analytics solutions using AWS services (Redshift, EMR, Glue, Kinesis, etc.).
- Collaborate with project teams, clients, and sales stakeholders to craft technical proposals and solution blueprints.
- Design scalable, secure, high-performance data models and ETL pipelines.
- Optimize data platforms for cost-efficiency, query performance, and concurrency.
- Ensure data governance, security, and compliance with best practices.
- Troubleshoot technical issues and provide mentorship to engineering teams.
- Stay ahead of industry trends and bring innovative solutions to the table.
- Report to practice leads, contribute to documentation, and support deployment activities.

Qualifications:
- 3–5 years of experience as a Solution Architect or Technical Lead in data analytics delivery.
- Hands-on expertise with AWS data tools (Redshift, EMR, Glue, Kinesis, etc.).
- Proficiency in SQL and Python; strong data modeling and ETL experience.
- Knowledge of Microsoft Azure Data Analytics tools is a plus.
- Experience working in Agile teams and using version control (e.g., Git).
- Strong communication skills and the ability to collaborate with technical and non-technical stakeholders.
- AWS Certifications (Solutions Architect and Data Analytics – Specialty) are required.

Preferred Skills:
- Team leadership in project delivery environments.
- Familiarity with data governance, data quality, and metadata management.
- Documentation, proposal writing, and client engagement skills.

What's In It For You?
- Opportunity to work with advanced AWS data technologies.
- Be part of a collaborative, innovation-focused team.
- Shape data strategies that directly impact enterprise decision-making.
- Career growth in a cloud-first, analytics-driven environment.

Ready to architect the future of data? Apply now or reach out to learn more!

#AWSJobs #DataAnalytics #SolutionArchitect #Hiring #AWSCareers #CloudComputing #DataEngineering #Redshift #Glue #AzureData #TechJobs
Posted 2 months ago
3.0 - 4.0 years
11 - 14 Lacs
Mumbai
Work from Office
AEP Data Architect

Requirements & Qualifications:
- 10+ years of strong experience with data transformation and ETL on large data sets.
- Experience designing customer-centric datasets (e.g., CRM, call center, marketing, offline, point of sale).
- 5+ years of data modeling experience (relational, dimensional, columnar, big data).
- 5+ years of complex SQL or NoSQL experience.
- Experience in advanced data warehouse concepts.
- Experience with industry ETL tools (e.g., Informatica, Unifi).
- Experience with business requirements definition and management, structured analysis, process design, and use case documentation.
- Experience with reporting technologies (e.g., Tableau, Power BI).
- Experience in professional software development.
- Exceptional organizational skills and the ability to multi-task across simultaneous customer projects.
- Strong verbal and written communication skills to interface with the sales team and lead customers to successful outcomes.
- Must be self-managed, proactive, and customer focused.
- Degree in Computer Science, Information Systems, Data Science, or a related field.
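One dimensional-modeling technique the data modeling requirement above commonly covers is the Type 2 slowly changing dimension: when a tracked attribute changes, the current row is closed and a new versioned row is appended, preserving history. A minimal sketch with hypothetical field names:

```python
def scd2_upsert(dim_rows, key, new_attrs, as_of):
    """Type 2 SCD update: close the current row for `key` if its attributes
    changed, and append a new open-ended version."""
    current = next((r for r in dim_rows
                    if r["key"] == key and r["end"] is None), None)
    if current is not None and current["attrs"] == new_attrs:
        return dim_rows                 # nothing changed; keep history as-is
    if current is not None:
        current["end"] = as_of          # close the old version
    dim_rows.append({"key": key, "attrs": new_attrs,
                     "start": as_of, "end": None})
    return dim_rows

# Hypothetical customer dimension: the segment changes in mid-2024.
dim = [{"key": "C1", "attrs": {"segment": "retail"},
        "start": "2023-01-01", "end": None}]
dim = scd2_upsert(dim, "C1", {"segment": "wealth"}, "2024-06-01")
```

In a warehouse this logic is usually expressed as a MERGE plus an INSERT, but the row-versioning contract is identical.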
Posted 2 months ago
7.0 - 8.0 years
12 - 14 Lacs
Noida
Work from Office
Requirements:
- Experience in developing and managing dashboards and reports in Tableau.
- In-depth knowledge and a sound understanding of RDBMS systems, SQL, business intelligence, and data analytics.
- Excellent analytical skills to forecast and predict trends and insights using past and current data.
- Knowledge of data architecture, data modelling, data mapping, data analysis, and data visualization.
- Able to build visually stunning and interactive dashboards in Tableau; knowledge of Power BI is good to have.

Mandatory Competencies:
- Reporting: Tableau, Power BI
- Data Analysis
- Database: SQL
- Behavioural: Communication and collaboration

At Iris Software, we offer world-class benefits designed to support the financial, health, and well-being needs of our associates to help achieve harmony between their professional and personal growth. From comprehensive health insurance and competitive salaries to flexible work arrangements and ongoing learning opportunities, we're committed to providing a supportive and rewarding work environment. Join us and experience the difference of working at a company that values its employees' success and happiness.
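The trend-forecasting skill above usually starts with fitting a line through a metric over time. As an illustrative sketch (a BI tool or statistics library would normally do this), here is an ordinary least-squares fit in pure Python over a hypothetical metric series:

```python
def fit_trend(values):
    """Return (slope, intercept) of the least-squares line through the
    points (0, values[0]), (1, values[1]), ..."""
    n = len(values)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(values) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, values))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var
    return slope, mean_y - slope * mean_x

# Hypothetical weekly metric growing by 2 per period.
slope, intercept = fit_trend([10.0, 12.0, 14.0, 16.0])
```

Projecting the next period is then `intercept + slope * n`; real forecasting would also consider seasonality and confidence intervals.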
Posted 2 months ago
5.0 - 10.0 years
35 - 40 Lacs
Mumbai
Work from Office
We have an opportunity to impact your career and provide an adventure where you can push the limits of what's possible.

As a Data Platform Engineering Lead at JPMorgan Chase within Asset and Wealth Management, you are an integral part of an agile team that works to enhance, build, and deliver trusted market-leading technology products in a secure, stable, and scalable way. As a core technical contributor, you are responsible for conducting critical technology solutions across multiple technical areas within various business functions in support of the firm's business objectives.

Job responsibilities:
- Lead the design, development, and implementation of scalable data pipelines and ETL batches using Python/PySpark on AWS.
- Execute standard software solutions, design, development, and technical troubleshooting.
- Use infrastructure as code to build applications that orchestrate and monitor data pipelines, create and manage on-demand compute resources on the cloud programmatically, and create frameworks to ingest and distribute data at scale.
- Manage and mentor a team of data engineers, providing guidance and support to ensure successful product delivery and support.
- Collaborate proactively with stakeholders, users, and technology teams to understand business/technical requirements and translate them into technical solutions.
- Optimize and maintain data infrastructure on the cloud platform, ensuring scalability, reliability, and performance.
- Implement data governance and best practices to ensure data quality and compliance with organizational standards.
- Monitor and troubleshoot applications and data pipelines, identifying and resolving issues in a timely manner.
- Stay up to date with emerging technologies and industry trends to drive innovation and continuous improvement.
- Add to a team culture of diversity, equity, inclusion, and respect.
Required qualifications, capabilities, and skills:
- Formal training or certification in software engineering concepts and 5+ years of applied experience.
- Experience in software development and data engineering, with demonstrable hands-on experience in Python and PySpark.
- Proven experience with cloud platforms such as AWS, Azure, or Google Cloud.
- Good understanding of data modeling, data architecture, ETL processes, and data warehousing concepts.
- Experience with, or good knowledge of, cloud-native ETL platforms like Snowflake and/or Databricks.
- Experience with big data technologies and services like AWS EMR, Redshift, Lambda, and S3.
- Proven experience with efficient cloud DevOps practices and CI/CD tools like Jenkins/GitLab for data engineering platforms.
- Good knowledge of SQL and NoSQL databases, including performance tuning and optimization.
- Experience with declarative infrastructure provisioning tools like Terraform, Ansible, or CloudFormation.
- Strong analytical skills to troubleshoot issues and optimize data processes, working independently and collaboratively.
- Experience leading and managing a team/pod of engineers, with a proven track record of successful project delivery.

Preferred qualifications, capabilities, and skills:
- Knowledge of the machine learning model lifecycle, language models, and cloud-native MLOps pipelines and frameworks is a plus.
- Familiarity with data visualization tools and data integration patterns.
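The pipeline monitoring and troubleshooting duties described above often boil down to handling transient failures gracefully. Here is a hedged sketch of a retry wrapper with exponential backoff around a flaky pipeline step; the step function and delay values are hypothetical, and sleeps are recorded rather than executed so the sketch stays self-contained.

```python
def run_with_retries(step, max_attempts=4, base_delay=1.0):
    """Call `step()` until it succeeds, doubling the backoff each failure.
    Returns (result, delays_used); re-raises after the final attempt."""
    delays = []
    for attempt in range(max_attempts):
        try:
            return step(), delays
        except RuntimeError:
            if attempt == max_attempts - 1:
                raise
            delays.append(base_delay * 2 ** attempt)  # a real runner sleeps here
    raise RuntimeError("unreachable")

# Hypothetical step that fails twice before loading successfully.
calls = {"n": 0}
def flaky_step():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient failure")
    return "loaded"

result, delays = run_with_retries(flaky_step)
```

Orchestrators such as Airflow expose the same idea as per-task `retries` and `retry_delay` settings.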
Posted 2 months ago
7.0 - 10.0 years
13 - 18 Lacs
Mumbai
Work from Office
Paramatrix Technologies Pvt. Ltd is looking for a Cloud Data Architect (Azure & AWS) to join our dynamic team and embark on a rewarding career journey. A Data Architect is a professional responsible for designing, building, and maintaining an organization's data architecture:
1. Designing and implementing data models, data integration solutions, and data management systems that ensure data accuracy, consistency, and security.
2. Developing and maintaining data dictionaries, metadata, and data lineage documents to ensure data governance and compliance.
3. A Data Architect should have a strong technical background in data architecture and management, as well as excellent communication skills.
4. Strong problem-solving skills and the ability to think critically are also essential to identify and implement solutions to complex data issues.
Posted 2 months ago
3.0 - 6.0 years
6 - 10 Lacs
Bengaluru
Work from Office
Paytm is India's leading mobile payments and financial services distribution company. A pioneer of the mobile QR payments revolution in India, Paytm builds technologies that help small businesses with payments and commerce. Paytm's mission is to serve half a billion Indians and bring them to the mainstream economy with the help of technology.

About the team:
The Credit Risk Product Team is at the core of our lending operations, ensuring that our risk assessment models are efficient, scalable, and compliant with regulatory frameworks. The team collaborates closely with data scientists, engineers, and business stakeholders to develop and enhance credit risk models, decisioning frameworks, and risk mitigation strategies. By leveraging advanced analytics and machine learning, the team continuously refines underwriting processes to optimize loan performance and reduce defaults.

About the Role:
We are looking for a motivated and curious AI & Data Product Analyst to join our team. This role requires a deep understanding of real-time and batch data pipelines and data modelling, including data ingestion, preparation, and integration to support various business cases. The analyst will be the subject matter expert on data availability and structure, guiding stakeholders on how best to leverage existing data assets, and will continuously analyze those assets to identify opportunities to optimize for cost and processing efficiency. The role also requires a deep understanding of data architecture for conversational AI systems. The analyst will analyze user interactions and product performance to generate actionable insights that drive continuous improvement of AI-driven conversational experiences. You will work closely with data scientists, engineers, product managers, and business stakeholders to understand data requirements, design robust data solutions, and ensure the accuracy and availability of data for our analytical tools.

Expectations:
1. Collaborate with business teams to understand their data and analytical requirements.
2. Work closely with engineering teams to ensure business needs are met through effective feature development.
3. Contribute to the development and optimization of big data architecture and data pipelines for real-time and batch processing.
4. Support the definition and design of the data architecture and pipelines required for conversational AI bots, including data ingestion, processing, annotation, and storage.
5. Analyze interaction data to inform the development and refinement of intent taxonomies, entity extraction, and dialogue management based on data-driven insights.
6. Analyze conversational AI interaction data to extract insights on user behavior, intent recognition accuracy, and dialogue effectiveness. Collaborate with AI engineers to integrate AI frameworks like LangChain, LangFlow, or other agentic AI platforms into product workflows.
7. Monitor and evaluate AI bot performance metrics, identify data quality issues, and recommend improvements. Translate business requirements into technical specifications for data architecture and AI product enhancements.
8. Stay updated on emerging conversational AI frameworks and tools, evaluating their applicability to business needs.

Key Skills Required:
1. Ideally 2-5 years of experience working on data analytics and business intelligence; candidates from B2C consumer internet product companies are preferred.
2. Proven work experience with MS Excel, Google Analytics, SQL, Data Studio, and any BI tool, in a business analyst or similar role.
3. Comfortable working in a fast-changing, ambiguous environment.
4. Critical thinking and very detail oriented.
5. In-depth understanding of datasets, data, and the business.
6. Capable of demonstrating good business judgement.

Education: Applicants must have an engineering academic background with specialization in data science.
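The bot-performance monitoring described above starts with a simple metric: how often the predicted intent matches a human-labelled one. An illustrative sketch over hypothetical interaction logs (the intent names and log schema are made up for the example):

```python
def intent_accuracy(logs):
    """Fraction of interactions where the predicted intent matched the label."""
    if not logs:
        return 0.0
    hits = sum(1 for r in logs if r["predicted"] == r["actual"])
    return hits / len(logs)

# Hypothetical labelled sample: three of four predictions are correct.
logs = [{"predicted": "check_balance", "actual": "check_balance"},
        {"predicted": "loan_status",   "actual": "loan_status"},
        {"predicted": "loan_status",   "actual": "close_account"},
        {"predicted": "check_balance", "actual": "check_balance"}]
accuracy = intent_accuracy(logs)
```

In practice this would be sliced per intent (a confusion matrix) to show which intents need more training data.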
Why join us: We aim to bring half a billion Indians into the mainstream economy, and everyone working here is striving to achieve that goal. Our success is rooted in our people’s collective energy and unwavering focus on the customers, and that’s how it will always be. We are the largest merchant acquirer in India. Compensation: If you are the right fit, we believe in creating wealth for you. With an enviable 500 mn+ registered users, 21 mn+ merchants and depth of data in our ecosystem, we are in a unique position to democratize credit for deserving consumers & merchants – and we are committed to it. India’s largest digital lending story is brewing here. It is your opportunity to be a part of the story!
Posted 2 months ago
5.0 - 8.0 years
8 - 18 Lacs
Mumbai, Hyderabad, Pune
Hybrid
Role & responsibilities: Design, implement, and manage cloud-based solutions on AWS and Snowflake. Work with stakeholders to gather requirements and design solutions that meet their needs. Develop and execute test plans for new solutions. Oversee and design the information architecture for the data warehouse, including all information structures such as the staging area, data warehouse, data marts, and operational data stores. Optimize Snowflake configurations and data pipelines to improve performance, scalability, and overall efficiency. Deep understanding of Data Warehousing, Enterprise Architectures, Dimensional Modeling, Star & Snowflake schema design, Reference DW Architectures, ETL (Extract, Transform, Load) architecture, Data Analysis, Data Conversion, Transformation, Database Design, Data Warehouse Optimization, Data Mart Development, and Enterprise Data Warehouse Maintenance and Support. Significant experience working as a Data Architect with depth in data integration and data architecture for Enterprise Data Warehouse implementations (conceptual, logical, physical & dimensional models). Maintain documentation: Develop and maintain detailed documentation for data solutions and processes. Provide training: Offer training and leadership to share expertise and best practices with the team. Collaborate with and provide leadership to the data engineering team, ensuring that data solutions are developed according to best practices. Preferred candidate profile: Snowflake, DBT, and Data Architecture design experience in Data Warehouse. Good to have: Informatica or other ETL knowledge or hands-on experience. Good to have: Databricks understanding. 5+ years of IT experience, with 3+ years of Data Architecture experience in Data Warehouse and 4+ years in Snowflake.
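The dimensional-modeling responsibility above (star/snowflake schemas, data marts) centers on maintaining dimension tables keyed by surrogate keys. A minimal, warehouse-agnostic sketch of a type-1 dimension load in plain Python (the in-memory dict stands in for a dimension table; field names are illustrative only):

```python
def load_dimension(dim, rows, natural_key):
    """Upsert rows into a dimension keyed by a surrogate key.

    dim maps natural-key value -> {"sk": int, **attributes}. New natural
    keys receive the next surrogate key; existing rows have their
    attributes overwritten in place (a type-1 slowly changing dimension).
    """
    next_sk = max((r["sk"] for r in dim.values()), default=0) + 1
    for row in rows:
        nk = row[natural_key]
        attrs = {k: v for k, v in row.items() if k != natural_key}
        if nk in dim:
            dim[nk].update(attrs)          # type-1: overwrite, keep sk
        else:
            dim[nk] = {"sk": next_sk, **attrs}
            next_sk += 1
    return dim

# Illustrative customer dimension: initial load, then a type-1 update.
dim = {}
load_dimension(dim, [{"customer_id": "C1", "city": "Pune"},
                     {"customer_id": "C2", "city": "Mumbai"}], "customer_id")
load_dimension(dim, [{"customer_id": "C1", "city": "Hyderabad"}], "customer_id")
```

In Snowflake itself the same upsert would typically be expressed as a `MERGE INTO` statement; the sketch only shows the surrogate-key bookkeeping the schema design implies.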
Posted 2 months ago
10.0 - 15.0 years
12 - 17 Lacs
Mumbai
Work from Office
Position Purpose: The Data Architect supports the work of ensuring that systems are designed, upgraded, managed, decommissioned and archived in compliance with data policy across the full data life cycle. This includes complying with the data strategy, undertaking the design of data models and supporting the management of metadata. The Data Architect mission will integrate a focus on GDPR, with contribution to the privacy impact assessment and the Record of Processing Activities relating to personal data. The scope is CIB EMEA and CIB ASIA. Responsibilities Direct Responsibilities: Engage with key business stakeholders to assist with establishing fundamental data governance processes. Define key data quality metrics and indicators and facilitate the development and implementation of supporting standards. Help to identify and deploy enterprise data best practices such as data scoping, metadata standardization, data lineage, data deduplication, mapping and transformation, and business validation. Structure the information in the Information System (using any data modelling tool, such as Abacus), i.e. the way information is grouped, as well as the navigation methods and the terminology used within the Information Systems of the entity, as defined by the lead data architects. Create and manage data models (business flows of personal data with the processes involved) in all their forms, including conceptual models, functional database designs, message models and others, in compliance with the data framework policy. Enable people to step logically through the Information System (and be able to train them to use tools like Abacus). Contribute to and enrich the Data Architecture framework through the material collected during analysis, projects and IT validations. Update all records in Abacus collected from stakeholder interviews/meetings.
Skill Areas Expected: Communicating between the technical and the non-technical: Is able to communicate effectively across organisational, technical and political boundaries, understanding the context. Makes complex and technical information and language simple and accessible for non-technical audiences. Is able to advocate and communicate what a team does to create trust and authenticity, and can respond to challenge. Able to effectively translate and accurately communicate across technical and non-technical stakeholders, as well as facilitating discussions within a multidisciplinary team with potentially difficult dynamics. Data Modelling (business flows of data in Abacus): Produces data models and understands where to use different types of data models. Understands different tools and is able to compare different data models. Able to reverse engineer a data model from a live system. Understands industry-recognized data modelling patterns and standards. Understands the concepts and principles of data modelling and is able to produce, maintain and update relevant data models for specific business needs. Data Standards (rules defined to manage/maintain data): Develops and sets data standards for an organisation. Communicates the business benefit of data standards, championing and governing those standards across the organisation. Develops data standards for a specific component. Analyses where data standards have been applied or breached and undertakes an impact analysis of that breach. Metadata Management: Understands a variety of metadata management tools. Designs and maintains the appropriate metadata repositories to enable the organization to understand their data assets. Works with metadata repositories to complete and maintain them, ensuring information remains accurate and up to date.
The objective is to manage own learning and contribute to domain knowledge building. Turning business problems into data design: Works with business and technology stakeholders to translate business problems into data designs. Creates optimal designs through iterative processes, aligning user needs with organisational objectives and system requirements. Designs data architecture by dealing with specific business problems and aligning it to enterprise-wide standards and principles. Works within the context of a well-understood architecture and identifies appropriate patterns. Contributing Responsibilities: It is expected that the Data Architect applies knowledge and experience of the capability, including tools and techniques, and adopts those most appropriate for the environment. The Data Architect needs to have knowledge of: the Functional & Application Architecture, Enterprise Architecture and architecture rules and principles; the activities of Global Markets and/or Global Banking; market meta-models, taxonomies and ontologies (such as FpML, CDM, ISO 20022). Skill Areas Expected: Data Communication: Uses the most appropriate medium to visualise data to tell compelling and actionable stories relevant for business goals. Presents, communicates and disseminates data appropriately and with high impact. Able to create basic visuals and presentations. Data Governance: Understands data governance and how it works in relation to other organisational governance structures. Participates in or delivers the assurance of a service. Understands what data governance is required and contributes to it. Data Innovation: Recognises and exploits business opportunities to ensure more efficient and effective performance of organisations. Explores new ways of conducting business and organisational processes. Aware of opportunities for innovation with new tools and uses of data. Technical & Behavioral Competencies: 1.
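The metadata-management duties above (keeping repository records accurate, complete and up to date) can be mechanized with a simple completeness audit. A toy sketch in plain Python, with an entirely hypothetical record schema; a real catalogue such as Abacus or Collibra would expose these records through its own API:

```python
# Mandatory fields each catalogue entry is assumed to carry (illustrative).
REQUIRED_FIELDS = ("name", "owner", "description", "personal_data")

def audit_metadata(records):
    """Flag catalogue entries with missing or empty mandatory fields.

    Returns {entry_name: [missing_field, ...]} for entries that fail;
    a compliant catalogue yields an empty dict.
    """
    issues = {}
    for rec in records:
        missing = [f for f in REQUIRED_FIELDS if rec.get(f) in (None, "")]
        if missing:
            issues[rec.get("name", "<unnamed>")] = missing
    return issues

# Illustrative catalogue: one complete entry, one missing its owner.
catalogue = [
    {"name": "customer", "owner": "Data Office",
     "description": "Client master data", "personal_data": True},
    {"name": "trade", "owner": "",
     "description": "Executed trades", "personal_data": False},
]
issues = audit_metadata(catalogue)
```

Note the check uses `in (None, "")` rather than truthiness, so a legitimate `personal_data: False` flag is not mistaken for a missing value.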
Able to effectively translate and accurately communicate across technical and non-technical stakeholders, as well as facilitating discussions within a multidisciplinary team with potentially difficult dynamics. 2. Able to create basic visuals and presentations. 3. Experience in working with enterprise tools for data cataloging and data management (like Abacus, Collibra, Alation, etc.) 4. Experience in working with BI tools (like Power BI) 5. Good understanding of Excel (formulas and functions). Specific Qualifications (if required) Preferred: BE/BTech, BSc-IT, BSc-Comp, MSc-IT, MSc-Comp, MCA. Skills Referential Behavioural Skills (please select up to 4 skills): Communication skills - oral & written; Ability to collaborate / Teamwork; Ability to deliver / Results driven; Creativity & Innovation / Problem solving. Transversal Skills (please select up to 5 skills): Analytical ability; Ability to understand, explain and support change; Ability to develop and adapt a process; Ability to anticipate business / strategic evolution. Education Level: Bachelor Degree or equivalent. Experience Level: At least 10 years. Other/Specific Qualifications (if required): 1. Experience in GDPR (General Data Protection Regulation) or in Privacy by Design would be preferred 2. DAMA Certified (good to have)
Posted 2 months ago
7.0 - 12.0 years
9 - 14 Lacs
Mumbai
Work from Office
Position Purpose: The Data Architect supports the work of ensuring that systems are designed, upgraded, managed, decommissioned and archived in compliance with data policy across the full data life cycle. This includes complying with the data strategy, undertaking the design of data models and supporting the management of metadata. The Data Architect mission will integrate a focus on GDPR, with contribution to the privacy impact assessment and the Record of Processing Activities relating to personal data. The scope is CIB EMEA and CIB ASIA. Responsibilities Direct Responsibilities: Engage with key business stakeholders to assist with establishing fundamental data governance processes. Define key data quality metrics and indicators and facilitate the development and implementation of supporting standards. Help to identify and deploy enterprise data best practices such as data scoping, metadata standardization, data lineage, data deduplication, mapping and transformation, and business validation. Structure the information in the Information System (using any data modelling tool, such as Abacus), i.e. the way information is grouped, as well as the navigation methods and the terminology used within the Information Systems of the entity, as defined by the lead data architects. Create and manage data models (business flows of personal data with the processes involved) in all their forms, including conceptual models, functional database designs, message models and others, in compliance with the data framework policy. Enable people to step logically through the Information System (and be able to train them to use tools like Abacus). Contribute to and enrich the Data Architecture framework through the material collected during analysis, projects and IT validations. Update all records in Abacus collected from stakeholder interviews/meetings.
Skill Areas Expected: Communicating between the technical and the non-technical: Is able to communicate effectively across organisational, technical and political boundaries, understanding the context. Makes complex and technical information and language simple and accessible for non-technical audiences. Is able to advocate and communicate what a team does to create trust and authenticity, and can respond to challenge. Able to effectively translate and accurately communicate across technical and non-technical stakeholders, as well as facilitating discussions within a multidisciplinary team with potentially difficult dynamics. Data Modelling (business flows of data in Abacus): Produces data models and understands where to use different types of data models. Understands different tools and is able to compare different data models. Able to reverse engineer a data model from a live system. Understands industry-recognized data modelling patterns and standards. Understands the concepts and principles of data modelling and is able to produce, maintain and update relevant data models for specific business needs. Data Standards (rules defined to manage/maintain data): Develops and sets data standards for an organisation. Communicates the business benefit of data standards, championing and governing those standards across the organisation. Develops data standards for a specific component. Analyses where data standards have been applied or breached and undertakes an impact analysis of that breach. Metadata Management: Understands a variety of metadata management tools. Designs and maintains the appropriate metadata repositories to enable the organization to understand their data assets. Works with metadata repositories to complete and maintain them, ensuring information remains accurate and up to date.
The objective is to manage own learning and contribute to domain knowledge building. Turning business problems into data design: Works with business and technology stakeholders to translate business problems into data designs. Creates optimal designs through iterative processes, aligning user needs with organisational objectives and system requirements. Designs data architecture by dealing with specific business problems and aligning it to enterprise-wide standards and principles. Works within the context of a well-understood architecture and identifies appropriate patterns. Contributing Responsibilities: It is expected that the Data Architect applies knowledge and experience of the capability, including tools and techniques, and adopts those most appropriate for the environment. The Data Architect needs to have knowledge of: the Functional & Application Architecture, Enterprise Architecture and architecture rules and principles; the activities of Global Markets and/or Global Banking; market meta-models, taxonomies and ontologies (such as FpML, CDM, ISO 20022). Skill Areas Expected: Data Communication: Uses the most appropriate medium to visualise data to tell compelling and actionable stories relevant for business goals. Presents, communicates and disseminates data appropriately and with high impact. Able to create basic visuals and presentations. Data Governance: Understands data governance and how it works in relation to other organisational governance structures. Participates in or delivers the assurance of a service. Understands what data governance is required and contributes to it. Data Innovation: Recognises and exploits business opportunities to ensure more efficient and effective performance of organisations. Explores new ways of conducting business and organisational processes. Aware of opportunities for innovation with new tools and uses of data. Technical & Behavioral Competencies: 1.
Able to effectively translate and accurately communicate across technical and non-technical stakeholders, as well as facilitating discussions within a multidisciplinary team with potentially difficult dynamics. 2. Able to create basic visuals and presentations. 3. Experience in working with enterprise tools (like Abacus, Informatica, big data platforms, Collibra, etc.) 4. Experience in working with BI tools (like Power BI) 5. Good understanding of Excel (formulas and functions). Specific Qualifications (if required) Preferred: BE/BTech, BSc-IT, BSc-Comp, MSc-IT, MSc-Comp, MCA. Skills Referential Behavioural Skills (please select up to 4 skills): Communication skills - oral & written; Ability to collaborate / Teamwork; Ability to deliver / Results driven; Creativity & Innovation / Problem solving. Transversal Skills (please select up to 5 skills): Analytical ability; Ability to understand, explain and support change; Ability to develop and adapt a process; Ability to anticipate business / strategic evolution. Education Level: Bachelor Degree or equivalent. Experience Level: At least 7 years. Other/Specific Qualifications (if required): 1. Experience in GDPR (General Data Protection Regulation) or in Privacy by Design would be preferred 2. DAMA Certified.
Posted 2 months ago
7.0 - 10.0 years
30 - 40 Lacs
Pune
Work from Office
Architect Application/Product IV. Minimum Education and Experience: T4 - Bachelor's Degree and 7 years of experience. Does this position have any direct reports? No. Is travel required for this position? Minimal. ESSENTIAL DUTIES AND RESPONSIBILITIES: Work with business users and stakeholders to define and analyze problems and provide optimal technical solutions. Translate business needs into technical specifications and design functional BI solutions. Present architecture and solutions to executive-level stakeholders. Adhere to industry best practices in all phases of design and architecture of the solution. Ensure the robustness and reliability of BI solutions during development, testing, and maintenance. Document all aspects of the BI system for future upgrades and maintenance. Provide guidance to ensure data governance, security, and compliance best practices in the architecture. REQUIRED SKILLS & QUALIFICATIONS TECHNICAL SKILLS: Data Modeling: Expertise in dimensional modeling, normalization/denormalization, and other data modeling techniques. ETL Processes: Proficiency in extract, transform, and load (ETL) processes. SQL and Database Design: Strong SQL coding skills and knowledge of database design principles. BI Platforms: Experience with any BI platform, preferably Power BI. Data Warehousing: Knowledge of data warehousing concepts and tools. Cloud Services: Experience with cloud platforms such as AWS, Azure, and Google Cloud. Data Governance: Understanding of data governance and data quality management. Scripting Languages: Proficiency in scripting languages like Python, R, or Java. MINIMUM QUALIFICATIONS: 8+ years of end-to-end design and architecture of enterprise-level data platforms and reporting/analytical solutions. 5+ years of expertise in real-time and batch reporting and analytical solution architecture.
4+ years of experience with Power BI, Tableau or similar technology solutions. ADDITIONAL QUALIFICATIONS: 8+ years of experience with dimensional modeling and data lake design methodologies. 8+ years of experience with relational and non-relational databases (e.g. SQL Server, Cosmos DB, etc.). Experience working with business stakeholders on requirements & use case analysis. Strong communication and collaboration skills with creative problem-solving skills. PREFERRED QUALIFICATIONS: Bachelor's degree in computer science or equivalent work experience. Experience with Agile/Scrum methodology. Experience with the tax and accounting domain a plus. Azure Data Engineer certification a plus.
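The dimensional-modeling skill this BI role calls for boils down to joining a fact table to a dimension and grouping by a dimension attribute. A plain-Python sketch of that star-schema rollup, with illustrative table shapes (in SQL this is simply a JOIN plus GROUP BY):

```python
from collections import defaultdict

def rollup(fact_rows, dim, group_attr, measure):
    """Aggregate a fact table by a dimension attribute.

    fact_rows carry a surrogate key ("sk") pointing into dim; dim maps
    sk -> attribute dict. This is the Python equivalent of:
        SELECT d.attr, SUM(f.measure) FROM fact f JOIN dim d ... GROUP BY d.attr
    """
    totals = defaultdict(float)
    for row in fact_rows:
        attr = dim[row["sk"]][group_attr]
        totals[attr] += row[measure]
    return dict(totals)

# Illustrative region dimension and sales fact table.
dim = {1: {"region": "West"}, 2: {"region": "South"}}
sales = [{"sk": 1, "amount": 10.0},
         {"sk": 2, "amount": 5.0},
         {"sk": 1, "amount": 2.5}]
by_region = rollup(sales, dim, "region", "amount")
```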
Posted 2 months ago
3.0 - 6.0 years
20 - 25 Lacs
Bengaluru
Hybrid
Join us as a Data Engineer II in Bengaluru! Build scalable data pipelines using Python, SQL, AWS, Airflow, and Kafka. Drive real-time & batch data systems across analytics, ML, and product teams. A hybrid work option is available. Required Candidate profile 3+ yrs in data engineering with strong Python, SQL, AWS, Airflow, Spark, Kafka, Debezium, Redshift, ETL & CDC experience. Must know data lakes, warehousing, and orchestration tools.
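Since the role above lists CDC experience (Debezium/Kafka), here is a minimal sketch of what applying change-data-capture events to a keyed snapshot looks like. The event shape is deliberately simplified; real Debezium envelopes carry additional fields such as `before`, `source`, and timestamps:

```python
def apply_cdc(table, events):
    """Replay simplified Debezium-style change events onto a snapshot.

    table maps primary key -> row. Each event is assumed to look like:
        {"op": "c" | "u" | "d", "key": pk, "after": row_or_None}
    Creates and updates write the "after" image; deletes drop the key.
    """
    for ev in events:
        if ev["op"] in ("c", "u"):
            table[ev["key"]] = ev["after"]
        elif ev["op"] == "d":
            table.pop(ev["key"], None)
    return table

# Illustrative event stream: two inserts, one update, one delete.
snapshot = {}
apply_cdc(snapshot, [
    {"op": "c", "key": 1, "after": {"id": 1, "status": "new"}},
    {"op": "c", "key": 2, "after": {"id": 2, "status": "new"}},
    {"op": "u", "key": 1, "after": {"id": 1, "status": "approved"}},
    {"op": "d", "key": 2, "after": None},
])
```

In a production pipeline the same replay logic would typically run inside a Kafka consumer or a Spark/Redshift MERGE step, but the ordering guarantee per key is the essential idea.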
Posted 2 months ago
5.0 - 15.0 years
7 - 17 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
Health Care - Data Engineer Architect. Job Title: Health Care - Data Engineer Architect. Location: Chennai, Bangalore, Hyderabad. Experience: 5 - 15 Years. Job Summary: We are seeking a highly experienced Healthcare Data Architect to design and manage robust healthcare data systems that meet regulatory requirements, enable interoperability, and support advanced analytics. The ideal candidate will bring over a decade of expertise in healthcare IT, data modelling, and cloud-based solutions to architect scalable, secure systems that serve EHRs, population health, claims, and real-time analytics use cases. Mandatory Skills: Data Architecture (healthcare domain); HL7, FHIR, EHR/EMR systems; Data Warehousing & Data Lakes; Cloud Platforms (AWS, Azure, GCP); Data Governance & MDM; SQL, NoSQL, Python, Spark. Key Responsibilities: Architect and implement scalable healthcare data platforms. Ensure compliance with HIPAA, GDPR, and other healthcare regulations. Design data models for EHR, claims, and clinical data. Optimize ETL pipelines and manage data flow integrity. Lead data warehouse and data lake development. Drive interoperability through standards like HL7 and FHIR. Implement data governance and quality frameworks. Qualifications: Bachelor's or Master's in Computer Science, Information Systems, or a related field. Certifications in AWS/Azure and healthcare standards (HL7/FHIR) preferred. Technical Skills: SQL, Python, Spark, Java; HL7, FHIR, CCD, JSON, XML; Cloud: AWS (Glue, Redshift), Azure (Synapse, Data Factory), GCP; BI: Power BI, Tableau; Data modeling tools: Erwin, Enterprise Architect. Soft Skills: Strong analytical and problem-solving ability. Excellent communication & stakeholder engagement. Team leadership and mentoring. Adaptability in fast-paced environments. Good to Have: Experience with AI/ML in healthcare pipelines. Familiarity with population health & claims analytics. Regulatory reporting experience (CMS, NCQA). Minimum 10 years in data architecture, with 5+ years in the healthcare
domain. Proven track record in implementing full-cycle data solutions and governance. Competitive salary + performance incentives. Comprehensive health insurance & wellness programs. Learning and development allowance. Remote/hybrid flexibility. ESOPs for senior leadership (if applicable). Key Result Areas (KRAs): Scalable & compliant data architecture delivery. HL7/FHIR integration and uptime. Timely milestone delivery & cross-functional collaboration. Quality, consistency, and governance of healthcare data. Key Performance Indicators (KPIs): Reduction in ETL/data latency and failures. Improvement in data quality metrics. On-time solution deployment success rate. Audit pass rate and compliance score.
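The FHIR interoperability work this posting emphasizes ultimately means consuming FHIR resources as JSON. A minimal sketch of extracting display fields from a FHIR R4 Patient resource using only the standard library; only the elements used here are assumed present, and production code should guard against absent elements:

```python
import json

def patient_summary(resource_json):
    """Pull basic display fields out of a FHIR R4 Patient resource.

    In FHIR R4, Patient.name is a list of HumanName elements, each with
    a "family" string and a "given" list of strings.
    """
    p = json.loads(resource_json)
    name = p["name"][0]
    full_name = " ".join(name.get("given", []) + [name.get("family", "")])
    return {
        "id": p["id"],
        "name": full_name.strip(),
        "gender": p.get("gender"),
        "birthDate": p.get("birthDate"),
    }

# Illustrative (synthetic) Patient resource.
raw = json.dumps({
    "resourceType": "Patient",
    "id": "pat-001",
    "name": [{"family": "Rao", "given": ["Anita"]}],
    "gender": "female",
    "birthDate": "1988-04-12",
})
summary = patient_summary(raw)
```

Flattening steps like this are typically the first stage of an EHR ingestion pipeline, before loading into a warehouse model for claims or population-health analytics.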
Posted 2 months ago
5.0 - 15.0 years
7 - 17 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
Job Title: Data Architect. Location: Chennai, Bangalore, Hyderabad. Experience: 5 - 15 Years. Job Summary: We are seeking a highly experienced and strategic Data Architect to lead the design and implementation of robust data architecture solutions. The ideal candidate will have a deep understanding of data modelling, governance, integration, and analytics platforms. As a Data Architect, you will play a crucial role in shaping the data landscape across the organization, ensuring data availability, consistency, security, and quality. Mandatory Skills: Enterprise data architecture and modeling; Cloud data platforms (Azure, AWS, GCP); Data warehousing and lakehouse architecture; Data governance and compliance frameworks; ETL/ELT design and orchestration; Master Data Management (MDM); Databricks architecture and implementation. Key Responsibilities: Lead, define, and implement end-to-end modern data platforms on public cloud using Databricks. Design and manage scalable data models and storage solutions. Collaborate with enterprise architects, data architects, ETL developers & engineers, data scientists, and information designers to define required data structures, formats, pipelines, metadata, and workload orchestration capabilities. Establish data standards, governance policies, and best practices. Oversee the integration of new data technologies and tools. Lead the development of data pipelines, marts, and lakes. Ensure data solutions are compliant with security and regulatory standards. Address aspects such as data privacy & security, data ingestion & processing, data storage & compute, analytical & operational consumption, data modeling, data virtualization, self-service data preparation & analytics, AI enablement, and API integrations. Mentor data engineers and developers on best practices. Qualifications: Bachelor's or Master's degree in Computer Science, Engineering, or a related field. Relevant certifications in cloud platforms, data
architecture, or governance. Technical Skills: Data Modelling: conceptual, logical, physical modelling (Erwin, PowerDesigner, etc.). Cloud: Azure (ADF, Synapse, Databricks), AWS (Redshift, Glue), GCP (BigQuery). Databases: SQL Server, Oracle, PostgreSQL, NoSQL (MongoDB, Cassandra). Data Integration: Informatica, Talend, Apache NiFi. Big Data: Hadoop, Spark, Kafka. Governance Tools: Collibra, Alation, Azure Purview. Scripting: Python, SQL, Shell. DevOps/DataOps practices and CI/CD tools. Soft Skills: Strong leadership and stakeholder management. Excellent communication and documentation skills. Strategic thinking with problem-solving ability. Collaborative and adaptive in cross-functional teams. Experience in AI/ML data lifecycle support. Exposure to industry data standards and frameworks (TOGAF, DAMA-DMBOK). Experience with real-time analytics and streaming data solutions. Work Experience: Minimum 10 years in data engineering, architecture, or related roles. At least 5 years of hands-on experience in designing data platforms on Azure. Demonstrated knowledge of 2 full project cycles using Databricks as an architect. Experience supporting and working with cross-functional teams in a dynamic environment. Advanced working SQL knowledge and experience working with relational databases and unstructured datasets. Experience with stream-processing systems such as Storm or Spark Streaming. Competitive salary and annual performance-based bonuses. Comprehensive health and optional parental insurance. Retirement savings plans and tax savings plans.
Work-Life Balance: Flexible work hours. Key Result Areas (KRAs): Effective implementation of scalable and secure data architecture. Governance and compliance adherence. Standardization and optimization of data assets. Enablement of self-service analytics and data democratization. Key Performance Indicators (KPIs): Architecture scalability and reusability metrics. Time-to-delivery for data initiatives. Data quality and integrity benchmarks. Satisfaction ratings from business stakeholders.
Posted 2 months ago
4.0 - 8.0 years
13 - 17 Lacs
Bengaluru
Work from Office
Job Title: Associate Specialist - Data Engineering. Location: Bengaluru. Shift: UK Shift. About the Role: We are seeking a skilled and experienced Data Engineer to join our team and help build, optimize, and maintain data pipelines and architectures. The ideal candidate will have deep expertise in the Microsoft data engineering ecosystem, particularly leveraging tools such as Azure Data Factory, Databricks, Synapse Analytics, and Microsoft Fabric, along with a strong command of SQL, Python, and Apache Spark. Key Responsibilities: Design, develop, and optimize scalable data pipelines and workflows using Azure Data Factory, Synapse Pipelines, and Microsoft Fabric. Build and maintain ETL/ELT processes for ingesting structured and unstructured data from various sources. Develop and manage data transformation logic using Databricks (PySpark/Spark SQL) and Python. Collaborate with data analysts, architects, and business stakeholders to understand requirements and deliver high-quality data solutions. Ensure data quality, integrity, and governance across the data lifecycle. Implement monitoring and alerting for data pipelines to ensure reliability and performance. Work with Azure Synapse Analytics to build data models and enable analytics and reporting. Utilize SQL for querying and managing large datasets efficiently. Participate in data architecture discussions and contribute to technical design decisions. Required Skills and Qualifications: Experience in data engineering or a related field. Strong proficiency in the Microsoft Azure data ecosystem, including Azure Data Factory (ADF), Azure Synapse Analytics, Microsoft Fabric, and Azure Databricks. Solid experience with Python and Apache Spark (including PySpark). Advanced skills in SQL for data manipulation and transformation. Experience in designing and implementing data lakes and data warehouses. Familiarity with data governance, security, and compliance standards. Strong analytical and problem-solving skills.
Excellent communication and collaboration abilities. Preferred Qualifications: Microsoft Azure certifications (e.g., Azure Data Engineer Associate). Experience with DevOps tools and CI/CD practices in data workflows. Knowledge of REST APIs and integration techniques. Background in agile methodologies and working in cross-functional teams.
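The "ensure data quality across the data lifecycle" responsibility above usually takes the form of a row-level quality gate early in the pipeline. A minimal plain-Python sketch (field names and rules are illustrative; in Databricks the same checks would typically be PySpark filters or Delta constraints):

```python
def validate_batch(rows, required, numeric):
    """Split an ingestion batch into accepted and rejected rows.

    Rejects rows with missing required fields or non-numeric measures and
    records a reason, so rejects can be routed to a quarantine table.
    Returns (good_rows, [(bad_row, reason), ...]).
    """
    good, bad = [], []
    for row in rows:
        if any(row.get(c) in (None, "") for c in required):
            bad.append((row, "missing required field"))
        elif any(not isinstance(row.get(c), (int, float)) for c in numeric):
            bad.append((row, "non-numeric measure"))
        else:
            good.append(row)
    return good, bad

# Illustrative batch: one clean row, one missing key, one bad measure.
good, bad = validate_batch(
    [{"id": 1, "amount": 125.0},
     {"id": None, "amount": 40.0},
     {"id": 3, "amount": "n/a"}],
    required=("id",), numeric=("amount",))
```

Feeding the reject counts into the pipeline's monitoring and alerting (also called out above) turns this from a one-off check into an ongoing quality signal.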
Posted 2 months ago
2.0 - 6.0 years
6 - 10 Lacs
Hyderabad
Work from Office
Key Responsibilities: Big Data Architecture: Design, build, and implement scalable Big Data solutions to process and analyze vast datasets in a timely and efficient manner. Data Pipeline Development: Develop ETL (Extract, Transform, Load) pipelines for large-scale data processing. Ensure data pipelines are automated, scalable, and robust enough to handle high volumes of data. Distributed Systems: Work with distributed computing frameworks (e.g., Apache Hadoop, Apache Spark, Flink) to process big datasets across multiple systems and clusters. Data Integration: Integrate data from multiple sources (structured, semi-structured, and unstructured) into a unified data architecture.
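A basic building block behind the "high volumes of data" requirement above is processing a large extract in bounded-memory chunks rather than materializing it all at once. A minimal generator-based sketch in plain Python (distributed frameworks like Spark apply the same idea via partitions):

```python
def batched(records, size):
    """Yield fixed-size chunks from any iterable.

    Works on generators and streams too, so only one chunk is held in
    memory at a time; the final chunk may be smaller than `size`.
    """
    batch = []
    for rec in records:
        batch.append(rec)
        if len(batch) == size:
            yield batch
            batch = []
    if batch:
        yield batch

# Seven records in chunks of three: two full batches plus a remainder.
chunks = [list(c) for c in batched(range(7), 3)]
```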
Posted 2 months ago
10.0 - 15.0 years
3 - 7 Lacs
Bengaluru
Work from Office
Total Experience: 10+ Years. Relevant Experience: 10+ Years. Rate: 11000 INR/day. Interview Mode: One F2F round is mandatory (kindly avoid candidates who cannot attend one F2F round). The candidate should be ready to join as a subcontractor. If the relevant and total years of experience are not as mentioned, the profile will be rejected outright. Profiles with a higher rate will be rejected outright. Please treat the below requirement as critical and share 2 quality profiles who are genuinely interested in the subcon role. Kindly go through the instructions below very carefully before submitting, along with the requested details. Vendors should check the requirement clearly and not send profiles based on keyword search alone. Vendors should check the availability and interest of the resource to join as a subcontractor. Kindly submit profiles within the rate card. Ensure there are no ex-Infosys employee profile submissions, as we have a 6-month cooling period. We need only the top 1-2 quality profiles; avoid multiple mail threads on profile submission.
ECMS Req #: 514780
Number of Openings: 1
Duration of Hiring: 12 Months
Relevant and Total Years of Experience: 10+
Detailed job description - Skill Set: Create, test, and implement enterprise-level apps with Snowflake. Design and implement features for identity and access management. Create authorization frameworks for better access control. Implement client query optimization and major security competencies with encryption. Solve performance and scalability issues in the system. Transaction management with distributed data processing algorithms. Possess ownership right from start to finish. Migrate solutions from on-premises setups to cloud-based platforms. Understand and implement the latest delivery approaches based on data architecture. Project documentation and tracking based on understanding user requirements. Perform data integration with third-party tools, including architecting, designing, coding, and testing phases. Manage documentation of data models, architecture, and maintenance processes. Continually review and audit data models for enhancement. Performance tuning, user acceptance training, and application support. Maintain confidentiality of data. Risk assessment, management, and mitigation plans. Regular engagement with teams for status reporting and routine activities. Migration activities from one database to another, or on-premises to cloud.
Mandatory Skills (only 2 or 3): Snowflake Developer
Vendor Billing range in local currency (per day): 11000 INR/day
Work Location: Chennai, Hyderabad, Bangalore, or Mysore (Infosys location)
WFO/WFH/Hybrid: Hybrid WFO
Is there any working in shifts from standard daylight (to avoid confusion post onboarding)? No
Mode of interview: F2F
BG check before or after onboarding: Post Onboarding
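The "authorization frameworks for better access control" item above maps to Snowflake's role-based access model, where roles can be granted to other roles and privileges are inherited through the hierarchy. A toy sketch of that resolution in plain Python (role and privilege names are invented for illustration; in Snowflake itself this is managed with GRANT statements, not application code):

```python
def effective_privileges(role, direct_grants, granted_roles):
    """Resolve all privileges a role holds, including inherited ones.

    direct_grants: role -> set of privileges granted directly to it.
    granted_roles: role -> list of roles granted TO it (whose privileges
    it therefore inherits). Walks the hierarchy transitively.
    """
    seen, stack, privs = set(), [role], set()
    while stack:
        r = stack.pop()
        if r in seen:
            continue                       # guard against cycles
        seen.add(r)
        privs |= direct_grants.get(r, set())
        stack.extend(granted_roles.get(r, ()))
    return privs

# Illustrative hierarchy: sysadmin inherits engineer, which inherits analyst.
grants = {"analyst": {"SELECT"}, "engineer": {"INSERT"},
          "sysadmin": {"CREATE TABLE"}}
hierarchy = {"sysadmin": ["engineer"], "engineer": ["analyst"]}
privs = effective_privileges("sysadmin", grants, hierarchy)
```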
Posted 2 months ago
8.0 - 13.0 years
45 - 50 Lacs
Bengaluru
Work from Office
The Lead Data Scientist will be responsible for organizing and reporting data related to sales numbers, market research, logistics, linguistics, or other behaviors. They use technical expertise to ensure that reported data is accurate and of high quality. Data must be analysed, designed, and presented in a way that helps individuals, business owners, and customer stakeholders make better decisions.

Responsibilities:
- Cross-functional collaboration: Collaborate seamlessly with Engineering, Product, and Operations teams to conceptualise, design, and construct data reporting and analytical systems.
- Ideation and analysis: Generate ideas for exploratory analysis, actively shaping the trajectory of future projects, and provide insightful recommendations for strategic actions based on data-driven insights.
- Rapid prototyping and product discussions: Drive the rapid prototyping of solutions and actively participate in discussions related to product and feature development.
- Dashboard creation and reporting: Develop dashboards and comprehensive documentation to communicate results effectively; regularly monitor key data metrics to facilitate informed decision-making.
- Integration with production systems: Collaborate closely with software engineers to deploy and integrate data models into production systems, ensuring the scalability, reliability, and efficiency of integrated solutions.
- Business metrics identification: Identify and analyze key business metrics, offer strategic insights, and recommend product features based on the identified metrics to enhance overall product functionality.
- Data infrastructure: Lead the design and development of scalable, robust, and efficient data architectures; oversee the development and maintenance of ETL (Extract, Transform, Load) processes that move and transform data from various sources into data warehouses or other storage solutions; ensure data quality and integrity throughout the data pipeline.
- Team leadership: Lead a team of data engineers and data analysts, providing mentorship, guidance, and technical expertise; coach and manage the team to deliver using Agile processes and ensure high ROI; participate actively in recruiting and nurturing engineers as capable as yourself.

Skills:
- Exceptional quantitative and problem-solving skills, capable of tackling complex data-driven challenges and formulating effective solutions.
- Proven proficiency in essential data science libraries, including Pandas, NumPy, SciPy, and scikit-learn, for data manipulation, analysis, and modeling.
- In-depth expertise in Python programming and SQL, with a focus on data analysis, model building, and algorithmic implementation.
- Experience with distributed computing frameworks such as Hadoop or Spark for handling large-scale data processing and machine learning tasks.
- Data architecture: designing and implementing robust data architectures that support the organization's needs, including an understanding of different database systems, data lakes, and data warehouses.
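The role above centres on monitoring key business metrics with libraries like Pandas. A minimal sketch of that kind of metric aggregation, with all column names and figures invented for illustration:

```python
import pandas as pd

# Illustrative sketch (not from the posting): tracking a key business
# metric with pandas. The data and column names are invented.
sales = pd.DataFrame({
    "week":    [1, 1, 2, 2],
    "region":  ["North", "South", "North", "South"],
    "revenue": [120.0, 90.0, 150.0, 110.0],
})

# Aggregate revenue per week -- the kind of number a dashboard would plot
# and a lead would monitor for week-over-week movement.
weekly = (
    sales.groupby("week", as_index=False)["revenue"]
         .sum()
         .rename(columns={"revenue": "total_revenue"})
)
```

From here, the aggregated frame would typically feed a dashboard or an alerting check rather than be inspected by hand.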
Posted 2 months ago
10.0 - 15.0 years
12 - 17 Lacs
Chennai
Work from Office
We are looking for a BI Architect with 13+ years of experience to lead the design and implementation of scalable BI and data architecture solutions. The role involves driving data modeling, cloud-based pipelines, migration projects, and data lake initiatives using technologies such as AWS, Kafka, Spark, SQL, and Python. Experience with EDW modeling and architecture is a strong plus.

Key responsibilities:
- Design and develop scalable BI and data models to support enterprise analytics.
- Lead data platform migration from legacy BI systems to modern cloud architectures.
- Architect and manage data lakes, batch and streaming pipelines, and real-time integrations via Kafka and APIs.
- Support data governance, quality, and access-control initiatives.
- Partner with data engineers, analysts, and business stakeholders to deliver reliable, high-performing data solutions.
- Contribute to architecture decisions and platform scalability planning.

Qualifications:
- 10-15 years of relevant experience, with 10+ years in BI, data engineering, or data architecture roles.
- Proficiency in SQL, Python, Apache Spark, and Kafka.
- Strong hands-on experience with AWS data services (e.g., S3, Redshift, Glue, EMR).
- Track record of leading data migration and modernization projects.
- Solid understanding of data governance, security, and scalable pipeline design.
- Excellent collaboration and communication skills.

Good to have:
- Experience with enterprise data warehouse (EDW) modeling and architecture.
- Familiarity with BI tools such as Power BI, Tableau, Looker, or QuickSight.
- Knowledge of lakehouse, data mesh, or modern data stack concepts.
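The streaming-pipeline work described above usually boils down to windowed aggregation over an event stream. A hedged, plain-Python sketch of a tumbling-window count, the kind of operation a Kafka/Spark streaming job performs (event shapes and window size are invented assumptions):

```python
from collections import defaultdict

# Hedged sketch: tumbling-window event counting, as a streaming pipeline
# (Kafka consumer or Spark job) might compute it. All inputs are invented.

def tumbling_window_counts(events, window_seconds=60):
    """Count events per (window_start, event_type) bucket.

    `events` is an iterable of (epoch_seconds, event_type) pairs; each
    event lands in exactly one non-overlapping window of fixed width.
    """
    counts = defaultdict(int)
    for ts, event_type in events:
        window_start = (ts // window_seconds) * window_seconds
        counts[(window_start, event_type)] += 1
    return dict(counts)

events = [(5, "click"), (30, "view"), (65, "click"), (70, "click")]
windows = tumbling_window_counts(events)  # buckets: [0, 60) and [60, 120)
```

In production the same logic would run incrementally with watermarking for late events; the bucketing arithmetic, however, is exactly this.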
Posted 2 months ago
3.0 - 6.0 years
11 - 15 Lacs
Madurai, Tiruppur, Salem
Work from Office
Req ID: 126281
Remote Position: No
Region: Asia
Country: India
State/Province: Chennai
City: Guindy, Chennai

General Overview
Functional Area: Engineering
Career Stream: Design Software Engineering
Job Code: SSE-ENG-DSE
Job Level: Level 11
IC/MGR: Individual Contributor
Direct/Indirect Indicator: Indirect

Summary
Celestica is looking for skilled and enthusiastic software engineers to join our team in developing cutting-edge data centers that leverage advanced GPU technologies. In this dynamic role, you will build orchestration software for the entire rack, develop integrated visualization tools for rack components, and create comprehensive diagnostics to optimize GPU server utilization. The ideal candidate will have a strong background in orchestration software development and experience creating solutions for the data center industry.

Detailed Description
- Architect and develop a full-stack application to ease the task of designing, deploying, and monitoring a next-generation data center, including GPU/AI compute elements.
- Use cloud-native development methods to support Kubernetes deployments for different scenarios.
- Build template-driven rack design techniques to support various rack element compositions.
- Build scalable software that can gather data from a large number of devices, monitor it, and make it easy to visualize trends.
- Build network validation techniques for GPU-centric traffic patterns.
- Build agile software that can react immediately to operational issues and self-heal deployments.
- Optimize code for performance, efficiency, and scalability.
- Adopt GenAI tools for development efficiency.
- Work effectively in a team environment, collaborating with engineers and peer functional leads from different disciplines to innovate solutions, triage issues, and speed execution.
- Mentor and coach team members on the technical skills and approaches needed to solve problems.
- Present innovation and value addition from our software in technical forums and customer interactions.

Knowledge/Skills/Competencies
- Strong programming skills: extensive programming in Python and Go.
- Database system knowledge: experience with SQL databases such as PostgreSQL, NoSQL databases such as MongoDB, and time-series databases such as Prometheus.
- Kubernetes deployment skills: experience with container orchestration, pod health checks, networking, Helm charts, and deployment strategies.
- Familiarity with UI frameworks, REST API frameworks, and Backend-for-Frontend design methodologies.
- Debugging and testing skills: ability to identify and resolve software issues.
- Problem-solving skills: strong analytical and problem-solving abilities.
- Experience with data center deployments: prior experience with data center architectures and with developing and maintaining software for deployments is a must.
- Clear communication: proven ability to articulate requirements and vision to large and diverse audiences, through written documents such as architecture specifications and verbal presentations in technical forums, is required.

Physical Demands
Duties of this position are performed in a normal office environment. Duties may require extended periods of sitting and sustained visual concentration on a computer monitor or on numbers and other detailed data. Repetitive manual movements (e.g., data entry, using a computer mouse, using a calculator) are frequently required. Occasional travel may be required.

Typical Experience: 12 to 18 years
Typical Education: Bachelor's degree, or consideration of an equivalent combination of education and experience. Educational requirements may vary by geography.

Notes
This job description is not intended to be an exhaustive list of all duties and responsibilities of the position. Employees are held accountable for all duties of the job. Job duties and the percentage of time identified for any function are subject to change at any time.

Celestica is an equal opportunity employer. All qualified applicants will receive consideration for employment and will not be discriminated against on any protected status (including race, religion, national origin, gender, sexual orientation, age, marital status, veteran or disability status, or other characteristics protected by law).

At Celestica we are committed to fostering an inclusive, accessible environment where all employees and customers feel valued, respected, and supported. Special arrangements can be made for candidates who need them throughout the hiring process; please indicate your needs and we will work with you to meet them.

Company Overview
Celestica (NYSE, TSX: CLS) enables the world's best brands. Through our recognized customer-centric approach, we partner with leading companies in Aerospace and Defense, Communications, Enterprise, HealthTech, Industrial, Capital Equipment, and Energy to deliver solutions for their most complex challenges. As a leader in design, manufacturing, hardware platform, and supply chain solutions, Celestica brings global expertise and insight at every stage of product development, from drawing board to full-scale production and after-market services, for products ranging from advanced medical devices to highly engineered aviation systems to next-generation hardware platform solutions for the Cloud. Headquartered in Toronto, with talented teams spanning 40+ locations in 13 countries across the Americas, Europe, and Asia, we imagine, develop, and deliver a better future with our customers.

Celestica would like to thank all applicants; however, only qualified applicants will be contacted. Celestica does not accept unsolicited resumes from recruitment agencies or fee-based recruitment services.
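The Kubernetes deployment skills listed for this role mention pod health checks. As a minimal, hypothetical sketch of the readiness accounting behind such a probe (the threshold and probe model are invented for illustration, not Celestica's or kubelet's actual implementation):

```python
# Hedged sketch: readiness accounting for a pod health check, the kind of
# logic a rack-orchestration service might apply to probe results.
# failure_threshold and the probe model are illustrative assumptions.

def pod_ready(probe_results, failure_threshold=3):
    """Return True while consecutive probe failures stay below the
    threshold; a run of `failure_threshold` failures marks the pod unready.

    `probe_results` is a list of booleans, oldest probe first.
    """
    consecutive_failures = 0
    for ok in probe_results:
        consecutive_failures = 0 if ok else consecutive_failures + 1
    return consecutive_failures < failure_threshold
```

A real orchestrator would feed this from periodic HTTP/exec probes and trigger restarts or rescheduling when readiness is lost; the counting rule itself is this simple.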
Posted 2 months ago
8.0 - 13.0 years
25 - 30 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
At 5X our top priority now is to build out the platform. Picture every company using 5X as getting a core stack (data ingestion, warehouse, modelling and orchestration, and BI) out of the box, with user permissions and utilization insights. We are looking for Data Engineers with a proven track record of successfully delivering end-to-end data solutions. This is a full-time role.

Responsibilities
- Lead the analysis of clients' business requirements to deliver strategic data solutions that align with organizational goals.
- Oversee the design and development of data models and advanced Power BI dashboards, ensuring actionable insights for senior business stakeholders.
- Drive the implementation and optimization of complex data models in Power BI, leveraging medallion architecture principles (bronze, silver, gold layers) to ensure scalability and performance.
- Manage end-to-end project delivery, including defining project scope, timelines, and milestones, while ensuring alignment with client expectations and business objectives.
- Establish and enforce best practices in data engineering, ensuring high-quality, reliable, and scalable data solutions that meet industry standards.
- Proactively communicate project updates, risks, and resolutions to stakeholders, driving transparency and trust.
- Lead and mentor a team of data engineers and analysts, fostering a culture of collaboration, innovation, and accountability.
- Facilitate high-level stakeholder conversations, translating technical concepts into business value and managing expectations to ensure successful project outcomes.

Qualifications
- 6+ years of experience in data engineering, analytics, or related fields, with a proven track record of leading and delivering complex, end-to-end data solutions.
- Demonstrated project management experience: ensuring timely delivery and managing resources effectively.
- Proven leadership skills, with experience managing and mentoring high-performing teams to achieve project and organizational goals.
- Strong stakeholder management skills, with the ability to engage senior leaders, build relationships, and communicate complex technical concepts clearly.
- Advanced proficiency in SQL and Python for data manipulation, transformation, and analysis.
- Extensive experience with Snowflake and Power BI, including advanced dashboard development and optimization.
- Deep expertise in medallion data architecture (bronze, silver, gold layers) and advanced data modeling techniques.
- Advanced knowledge of DAX calculations and measures for sophisticated analytics and performance optimization.
- Exceptional problem-solving skills, with the ability to navigate ambiguity and drive solutions in a fast-paced environment.
- Outstanding communication and interpersonal skills, with a track record of proactive client engagement and stakeholder management.
- Self-motivated and results-oriented, with a demonstrated ability to unblock challenges, drive progress, and deliver with minimal supervision.

Benefits
- 100% remote company: we love to give our employees the freedom to choose where they want to work from.
- Wellness: we run monthly wellness programmes and workshops to make sure all our employees are happy and satisfied.
- Competitive compensation: we offer competitive compensation and meaningful equity.
- Parental leave: we value and support the family planning process and provide paid parental leave.
- Healthcare: we cover health benefits for all employees and their dependents.
- Offsite: one team offsite a year to incredible destinations. Check out our recent offsites in Thailand, Sri Lanka, and Bali.

About 5X
5X is a data and AI platform focused on traditional industries (such as banking, manufacturing, retail, real estate, healthcare, education, and government). Traditional businesses are struggling with data silos and poor data quality that slow the entire business. These businesses use legacy, hard-to-reach systems or platforms like SAP, Salesforce, and Oracle, which make it complex to get data out. 5X is able to extract data from hard-to-reach systems; centralize, clean, structure, and model it; and enable data and agentic GenAI capabilities. Unlike legacy data platform implementations, which take months and hundreds of thousands of dollars, we are able to demonstrate end-to-end use cases in 48 hours at a fraction of the price. 5X was founded in 2020 and has a presence in the USA, Singapore, the UK, and India. Our global team is 70+ people strong and rapidly growing. We're backed by Flybridge Capital, creators of popular open-source projects like Airflow, Superset, and Parquet, and founders from companies like Datadog, Astronomer, Mode, and Rudderstack.

Know about the company:
Website: https://5x.co/
LinkedIn: https://www.linkedin.com/company/datawith5x/
Glassdoor: https://glassdoor.co.in/Reviews/5X-Reviews-E6110869.htm
5X in 2 minutes: https://youtube.com/watch?v=45Ppi00Lw70
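The medallion architecture this role emphasizes layers data as bronze (raw), silver (cleaned and typed), and gold (business aggregates). A hedged, plain-Python illustration of that flow; in a real Snowflake/Power BI stack these layers would be tables or models, and every field name below is invented:

```python
# Hedged sketch of medallion layering (bronze -> silver -> gold) in plain
# Python. All data and field names are illustrative assumptions.

bronze = [  # raw ingested rows, possibly dirty
    {"order_id": "1", "amount": "100.5", "country": "us"},
    {"order_id": "2", "amount": None,    "country": "US"},   # fails quality check
    {"order_id": "3", "amount": "49.5",  "country": "IN"},
]

def to_silver(rows):
    """Silver layer: cleaned, typed records, with rows that fail basic
    data-quality checks dropped."""
    cleaned = []
    for r in rows:
        if r["amount"] is None:
            continue
        cleaned.append({
            "order_id": r["order_id"],
            "amount": float(r["amount"]),
            "country": r["country"].upper(),
        })
    return cleaned

def to_gold(rows):
    """Gold layer: a business-level aggregate (revenue per country) ready
    for a dashboard."""
    totals = {}
    for r in rows:
        totals[r["country"]] = totals.get(r["country"], 0.0) + r["amount"]
    return totals

silver = to_silver(bronze)
gold = to_gold(silver)
```

The design point of the layering is that each stage is reproducible from the one before it, so quality rules and business logic can evolve without re-ingesting raw data.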
Posted 2 months ago