15.0 - 20.0 years
10 - 14 Lacs
Bengaluru
Work from Office
About The Role
Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must-have skills: SAP BusinessObjects Data Services
Good-to-have skills: NA
Minimum 18 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. You will be responsible for overseeing the application development process and ensuring successful project delivery.

Roles & Responsibilities:
- Expected to be an SME with deep knowledge and experience.
- Should have influencing and advisory skills.
- Engage with multiple teams and be responsible for team decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Provide solutions to business area problems.
- Lead the application development team in designing, building, and configuring applications.
- Act as the primary point of contact for all application-related queries.
- Collaborate with various teams to ensure successful project delivery.

Professional & Technical Skills:
- Must-have skills: Proficiency in SAP BusinessObjects Data Services.
- Strong understanding of data integration and ETL processes.
- Experience in leading application development projects.
- Knowledge of SAP BusinessObjects tools and technologies.
- Hands-on experience in configuring and optimizing applications.

Additional Information:
- The candidate should have a minimum of 18 years of experience in SAP BusinessObjects Data Services.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Qualification: 15 years full time education
Posted 1 week ago
7.0 - 12.0 years
5 - 9 Lacs
Bengaluru
Work from Office
About The Role
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Teradata BI
Good-to-have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. Your typical day will involve collaborating with teams to develop solutions that align with business needs and requirements.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Lead the team in implementing innovative solutions.
- Conduct regular team meetings to ensure alignment and progress.
- Stay updated on industry trends and technologies to enhance team performance.

Professional & Technical Skills:
- Must-have skills: Proficiency in Teradata BI.
- Strong understanding of data warehousing concepts.
- Experience in ETL processes and data modeling.
- Knowledge of SQL and database management.
- Hands-on experience in developing BI solutions.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Teradata BI.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Qualification: 15 years full time education
Posted 1 week ago
3.0 - 8.0 years
4 - 8 Lacs
Bengaluru
Work from Office
About The Role
Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must-have skills: Informatica Data Quality
Good-to-have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to effectively migrate and deploy data across various systems. You will collaborate with team members to enhance data workflows and contribute to the overall efficiency of data management practices within the organization.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Assist in the design and implementation of data architecture to support data initiatives.
- Monitor and optimize data pipelines for performance and reliability.

Professional & Technical Skills:
- Must-have skills: Proficiency in Informatica Data Quality.
- Strong understanding of data integration techniques and ETL processes.
- Experience with data quality assessment and improvement methodologies.
- Familiarity with data governance principles and best practices.
- Ability to work with large datasets and perform data cleansing.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Informatica Data Quality.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Qualification: 15 years full time education
Posted 1 week ago
2.0 - 5.0 years
7 - 11 Lacs
Pune
Work from Office
Key Responsibilities

Process Improvement & Automation
- Identify repetitive and manual processes ripe for automation across reporting, documentation, and administrative workflows
- Lead initiatives to streamline operational tasks and improve capacity management
- Collaborate with cross-functional teams to implement scalable efficiency solutions

Reporting & Analytics
- Own monthly reporting cycles, including dashboards, benchmarking, and methodology-based tracking
- Proactively refine reporting methods for better transparency and time savings

Document & Site Management
- Maintain and manage SharePoint and Sphere sites, including user permissions, file structures, and content consistency
- Support migration planning and execution from legacy platforms (Box) to MyDocs (DocShare)

Policy Alignment Cycles (PAC)
- Coordinate PAC verification and documentation efforts twice annually, ensuring change-tracking and comment logging are consistent

Business Impact Assessment (BIA) Support
- Verify and update application lists and organizational structures by location
- Ensure data integrity and audit-readiness of structured documentation

Required Skills & Qualifications
- Strong eye for identifying automation opportunities and driving operational efficiency
- Highly organized, detail-oriented, and independently driven
- Familiarity with platforms like SharePoint, DocShare, Sphere, and other enterprise documentation tools
- Competence in reporting tools and dashboard creation (Excel, Power BI preferred)
- Excellent communication and collaboration skills
Posted 1 week ago
10.0 - 15.0 years
20 - 25 Lacs
Bengaluru
Hybrid
Job Title: Senior Ab Initio Developer
Company: Xebia
Location: Whitefield, Bangalore (Hybrid, 3 days/week from office)
Experience: 8 to 12 years
Joiners: Immediate to 2 weeks' notice only

Job Description:
We are hiring a Senior Ab Initio Developer to deliver robust ETL solutions for large-scale programs. This hybrid role allows you to work onsite 3 days/week at our Whitefield, Bangalore office.

Key Responsibilities & Skills:
- ~10 years total ETL experience, with 5+ years in Ab Initio
- Expertise in Ab Initio components (Join, Reformat, Normalize, etc.)
- Strong SQL, Oracle, Unix scripting
- Familiarity with CA7 scheduler or similar
- Working knowledge of IBM Sterling Integrator & File Gateway
- Knowledge of encryption protocols (PGP/GPG, AES, Blowfish)
- Agile delivery, integration testing, and performance tuning experience
- Proven leadership and change control skills

To Apply: Send your updated CV to vijay.s@xebia.com with these details:
- Full Name
- Total Experience
- Current CTC
- Expected CTC
- Current Location
- Preferred Xebia Location (Whitefield, Bangalore)
- Notice Period / Last Working Day
- Primary Skills
- LinkedIn Profile

Work Mode: Hybrid (3 days/week in Whitefield office)
Joiners: Immediate to 2 weeks only
Let's build something impactful together at Xebia!
Posted 2 weeks ago
7.0 - 10.0 years
20 - 35 Lacs
Chennai
Work from Office
Experience: 7.00+ years
Salary: Confidential (based on experience)
Shift: (GMT+05:30) Asia/Kolkata (IST)
Opportunity Type: Hybrid (Chennai)
Placement Type: Full-time Permanent Position
(Note: This is a requirement for one of Uplers' clients - Forbes Advisor)

What do you need for this opportunity?
Must-have skills required: Agile, Program Management, data infrastructure

Forbes Advisor is looking for: Program Manager - Data

Job Description:
We're hiring a Program Manager to orchestrate complex, cross-functional data initiatives, from revenue-pipeline automation to analytics product launches. You'll be the connective tissue between Data Engineering, Analytics, RevOps, Product, and external partners, ensuring programs land on time, on scope, and with measurable impact. If you excel at turning vision into executable roadmaps, mitigating risk before it bites, and communicating clearly across technical and business audiences, we'd love to meet you.

Key Responsibilities:
- Own program delivery for multi-team data products (e.g., revenue-data pipelines, attribution models, partner-facing reporting APIs).
- Build and maintain integrated roadmaps, aligning sprint plans, funding, and resource commitments.
- Drive agile ceremonies (backlog grooming, sprint planning, retrospectives) and track velocity, burn-down, and cycle-time metrics.
- Create transparent status reporting (risks, dependencies, OKRs) tailored for engineers up to C-suite stakeholders.
- Proactively remove blockers by coordinating with Platform, IT, Legal/Compliance, and external vendors.
- Champion process optimisation: intake, prioritisation, change management, and post-mortems.
- Partner with RevOps and Media teams to ensure program outputs translate into revenue growth and faster decision making.
- Facilitate launch readiness (QA checklists, enablement materials, go-live runbooks) so new data products land smoothly.
- Foster a culture of documentation, psychological safety, and continuous improvement within the data organisation.

Experience required:
- 7+ years of program or project-management experience in data, analytics, SaaS, or high-growth tech.
- Proven success delivering complex, multi-stakeholder initiatives on aggressive timelines.
- Expertise with agile frameworks (Scrum/Kanban) and modern collaboration tools (Jira, Asana, Notion/Confluence, Slack).
- Strong understanding of data and cloud concepts (pipelines, ETL/ELT, BigQuery, dbt, Airflow/Composer).
- Excellent written and verbal communication, able to translate between technical teams and business leaders.
- Risk-management mindset: identify, quantify, and drive mitigation before issues escalate.
- Experience coordinating across time zones and cultures in a remote-first environment.

Nice to Have:
- Formal certification (PMP, PMI-ACP, CSM, SAFe, or equivalent).
- Familiarity with GCP services, Looker/Tableau, or marketing-data stacks (Google Ads, Meta, GA4).
- Exposure to revenue operations, performance marketing, or subscription/affiliate business models.
- Background in change-management or process-improvement methodologies (Lean, Six Sigma).

Perks:
- Monthly long weekends: every third Friday off.
- Fitness and commute reimbursement.
- Remote-first culture with flexible hours and a high-trust environment.
- Opportunity to shape a world-class data platform inside a trusted global brand.
- Collaborate with talented engineers, analysts, and product leaders who value innovation and impact.

How to apply for this opportunity?
Step 1: Click on Apply and register or log in on our portal.
Step 2: Complete the screening form and upload your updated resume.
Step 3: Increase your chances of getting shortlisted and meet the client for the interview!

Forbes Advisor is a high-growth digital media and technology company that empowers consumers to make confident decisions about money, health, careers, and everyday life. Our global data organisation builds modern, AI-augmented pipelines that turn information into revenue-driving insight.
Posted 2 weeks ago
1.0 - 3.0 years
4 - 7 Lacs
Chennai
Work from Office
Design, develop, implement, and support data integration solutions, ensuring data quality and performance. Create reports and support application teams with database development to meet internal and client system needs.
Posted 2 weeks ago
3.0 - 6.0 years
3 - 6 Lacs
Chennai
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: SnapLogic
Good-to-have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years of full time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements using SnapLogic. Your typical day will involve working with the development team, analyzing business requirements, and developing solutions to meet those requirements.

Roles & Responsibilities:
- Design, develop, and maintain SnapLogic integrations and workflows to meet business requirements.
- Collaborate with cross-functional teams to analyze business requirements and develop solutions to meet those requirements.
- Develop and maintain technical documentation for SnapLogic integrations and workflows.
- Troubleshoot and resolve issues with SnapLogic integrations and workflows.

Professional & Technical Skills:
- Must-have skills: Strong experience in SnapLogic.
- Good-to-have skills: Experience in other ETL tools like Informatica, Talend, or DataStage.
- Experience in designing, developing, and maintaining integrations and workflows using SnapLogic.
- Experience in analyzing business requirements and developing solutions to meet those requirements.
- Experience in troubleshooting and resolving issues with SnapLogic integrations and workflows.

Additional Information:
- The candidate should have a minimum of 5 years of experience in SnapLogic.
- The ideal candidate will possess a strong educational background in computer science or a related field, along with a proven track record of delivering impactful solutions using SnapLogic.
- This position is based at our Pune office.

Qualification: 15 years of full time education
Posted 2 weeks ago
2.0 - 5.0 years
6 - 10 Lacs
Mumbai
Work from Office
As a Data Engineer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include:
- Implementing and validating predictive models, as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques
- Designing and implementing various enterprise search applications such as Elasticsearch and Splunk for client requirements
- Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours
- Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Expertise in data warehousing / information management / data integration / business intelligence using the ETL tool Informatica PowerCenter
- Knowledge of cloud, Power BI, and data migration on cloud
- Experience in Unix shell scripting and Python
- Experience with relational SQL, Big Data, etc.

Preferred technical and professional experience:
- Knowledge of MS Azure Cloud
- Experience in Informatica PowerCenter
- Experience in Unix shell scripting and Python
Posted 2 weeks ago
2.0 - 5.0 years
6 - 10 Lacs
Pune
Work from Office
As a Data Engineer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include:
- Implementing and validating predictive models, as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques
- Designing and implementing various enterprise search applications such as Elasticsearch and Splunk for client requirements
- Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours
- Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Expertise in data warehousing / information management / data integration / business intelligence using the ETL tool Informatica PowerCenter
- Knowledge of cloud, Power BI, and data migration on cloud
- Experience in Unix shell scripting and Python
- Experience with relational SQL, Big Data, etc.

Preferred technical and professional experience:
- Knowledge of MS Azure Cloud
- Experience in Informatica PowerCenter
- Experience in Unix shell scripting and Python
Posted 2 weeks ago
3.0 - 5.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Educational Qualification: Bachelor of Engineering
Service Line: Data & Analytics Unit

Responsibilities:
A day in the life of an Infoscion: As part of the Infosys delivery team, your primary role would be to ensure effective design, development, validation and support activities, to assure that our clients are satisfied with the high levels of service in the technology domain. You will gather the requirements and specifications to understand the client requirements in a detailed manner and translate the same into system requirements. You will play a key role in the overall estimation of work requirements to provide the right information on project estimations to Technology Leads and Project Managers. You would be a key contributor to building efficient programs/systems. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Technical and Professional Requirements:
Technology - Data Management - Data Integration - Ab Initio

Preferred Skills:
Technology - Data Management - Data Integration - Ab Initio
Posted 2 weeks ago
5.0 - 8.0 years
4 - 8 Lacs
Bengaluru
Work from Office
Educational Qualification: Bachelor of Engineering
Service Line: Data & Analytics Unit

Responsibilities:
A day in the life of an Infoscion: As part of the Infosys delivery team, your primary role would be to interface with the client for quality assurance, issue resolution and ensuring high customer satisfaction. You will understand requirements, create and review designs, validate the architecture and ensure high levels of service offerings to clients in the technology domain. You will participate in project estimation, provide inputs for solution delivery, conduct technical risk planning, perform code reviews and unit test plan reviews. You will lead and guide your teams towards developing optimized, high-quality code deliverables, continual knowledge management and adherence to the organizational guidelines and processes. You would be a key contributor to building efficient programs/systems. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional Responsibilities:
- Knowledge of more than one technology
- Basics of architecture and design fundamentals
- Knowledge of testing tools
- Knowledge of agile methodologies
- Understanding of project life cycle activities on development and maintenance projects
- Understanding of one or more estimation methodologies, knowledge of quality processes
- Basics of the business domain to understand the business requirements
- Analytical abilities, strong technical skills, good communication skills
- Good understanding of the technology and domain
- Ability to demonstrate a sound understanding of software quality assurance principles, SOLID design principles and modelling methods
- Awareness of latest technologies and trends
- Excellent problem solving, analytical and debugging skills

Technical and Professional Requirements:
Primary skills: Technology - Data Management - Data Integration - Talend

Preferred Skills:
Technology - Data Management - Data Integration - Talend
Posted 2 weeks ago
3.0 - 5.0 years
4 - 8 Lacs
Bengaluru
Work from Office
Educational Qualification: Bachelor of Engineering
Service Line: Data & Analytics Unit

Responsibilities:
A day in the life of an Infoscion: As part of the Infosys delivery team, your primary role would be to ensure effective design, development, validation and support activities, to assure that our clients are satisfied with the high levels of service in the technology domain. You will gather the requirements and specifications to understand the client requirements in a detailed manner and translate the same into system requirements. You will play a key role in the overall estimation of work requirements to provide the right information on project estimations to Technology Leads and Project Managers. You would be a key contributor to building efficient programs/systems. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Technical and Professional Requirements:
Technology - Data Management - Data Integration Administration - Informatica Administration

Preferred Skills:
Technology - Data Management - Data Integration Administration - Informatica Administration
Posted 2 weeks ago
9.0 - 11.0 years
10 - 13 Lacs
Pune
Work from Office
Educational Qualification: Bachelor of Engineering
Service Line: Data & Analytics Unit

Responsibilities:
- Knowledge of more than one technology
- Basics of architecture and design fundamentals
- Knowledge of testing tools
- Knowledge of agile methodologies
- Understanding of project life cycle activities on development and maintenance projects
- Understanding of one or more estimation methodologies, knowledge of quality processes
- Basics of the business domain to understand the business requirements
- Analytical abilities, strong technical skills, good communication skills
- Good understanding of the technology and domain
- Ability to demonstrate a sound understanding of software quality assurance principles, SOLID design principles and modelling methods
- Awareness of latest technologies and trends
- Excellent problem solving, analytical and debugging skills

Technical and Professional Requirements:
Technology - Data on Cloud - Datastore - Cloud-based Integration Platforms - Informatica Intelligent Cloud Services (IICS)

Preferred Skills:
Technology - Data on Cloud - Datastore - Cloud-based Integration Platforms - Informatica Intelligent Cloud Services (IICS)
Posted 2 weeks ago
3.0 - 8.0 years
3 - 7 Lacs
Bengaluru
Work from Office
Educational Qualification: Bachelor of Engineering, BCS, BBA, BCom, MCA, MSc
Service Line: Data & Analytics Unit

Responsibilities:
- Good knowledge of Snowflake architecture
- Understanding of virtual warehouses: multi-cluster warehouses, autoscaling
- Metadata and system objects: query history, grants to users, grants to roles, users
- Micro-partitions, table clustering, auto reclustering
- Materialized views and their benefits
- Data protection with Time Travel in Snowflake (extremely important)
- Analyzing queries using Query Profile (extremely important; explain plan)
- Cache architecture
- Virtual warehouses (VW), named stages, direct loading
- Snowpipe, data sharing, streams, JavaScript procedures and tasks
- Strong ability to design and develop workflows in Snowflake on at least one cloud platform (preferably AWS)
- Apply Snowflake programming and ETL experience to write Snowflake SQL and maintain a complex, internally developed reporting system
- Preferably, knowledge of ETL activities such as data processing from multiple source systems
- Extensive knowledge of query performance tuning
- Apply knowledge of BI tools
- Manage time effectively; accurately estimate effort for tasks and meet agreed-upon deadlines; effectively juggle ad-hoc requests and longer-term projects

Snowflake performance specialist:
- Familiar with zero-copy cloning and using Time Travel features to clone a table (see the short example below)
- Familiar with reading a Snowflake query profile, understanding what each step does, and identifying performance bottlenecks from the query profile
- Understanding of when a table needs to be clustered, and choosing the right cluster key as part of table design to help query optimization
- Working with materialized views and the benefits-vs-cost trade-off
- How Snowflake micro-partitions are maintained and the performance implications with respect to micro-partitions, pruning, etc.
- Horizontal vs vertical scaling and when to use each; concept of multi-cluster warehouses and autoscaling
- Advanced SQL knowledge, including window functions and recursive queries, and the ability to understand and rewrite complex SQL as part of performance optimization

Additional Responsibilities:
Domain: Data Warehousing, Business Intelligence
Precise Work Location: Bhubaneswar, Bangalore, Hyderabad, Pune

Technical and Professional Requirements:
Mandatory skills: Snowflake
Desired skills: Teradata/Python (not mandatory)

Preferred Skills:
Cloud Platform - Snowflake
Technology - OpenSystem - Python - OpenSystem
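As a quick illustration of the Time Travel and zero-copy cloning features named in this listing, here is a minimal Snowflake SQL sketch; the table names, offsets, and timestamps are hypothetical and not part of the role description:

```sql
-- Zero-copy clone of a table as it stood 30 minutes ago (Time Travel)
CREATE TABLE orders_restore CLONE orders AT (OFFSET => -60 * 30);

-- Query the same table as of a specific point in time
SELECT COUNT(*)
FROM orders AT (TIMESTAMP => '2024-01-15 08:00:00'::TIMESTAMP_LTZ);

-- Inspect recent queries to pick candidates for Query Profile analysis
SELECT query_text, total_elapsed_time, warehouse_name
FROM TABLE(INFORMATION_SCHEMA.QUERY_HISTORY())
ORDER BY start_time DESC
LIMIT 10;
```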
Posted 2 weeks ago
3.0 - 5.0 years
4 - 8 Lacs
Bengaluru
Work from Office
Educational Qualification: Bachelor of Engineering
Service Line: Data & Analytics Unit

Responsibilities:
- Knowledge of more than one technology
- Basics of architecture and design fundamentals
- Knowledge of testing tools
- Knowledge of agile methodologies
- Understanding of project life cycle activities on development and maintenance projects
- Understanding of one or more estimation methodologies, knowledge of quality processes
- Basics of the business domain to understand the business requirements
- Analytical abilities, strong technical skills, good communication skills
- Good understanding of the technology and domain
- Ability to demonstrate a sound understanding of software quality assurance principles, SOLID design principles and modelling methods
- Awareness of latest technologies and trends
- Excellent problem solving, analytical and debugging skills

Technical and Professional Requirements:
Technology - Data Management - MDM - Informatica MDM

Preferred Skills:
Technology - Data Management - MDM - Informatica MDM
Technology - Data Management - MDM - Informatica PIM
Posted 2 weeks ago
5.0 - 10.0 years
5 - 15 Lacs
Hyderabad
Work from Office
We are looking for a highly skilled Senior Database Developer who can work independently under limited supervision and apply their expertise in database design, development, and maintenance. This role requires a strong background in SQL, relational databases, and data modelling, with a focus on optimizing performance and supporting business intelligence capabilities.

Responsibilities:
- Provide strategic direction and guidance for enterprise data architecture that supports business intelligence capabilities.
- Design and develop conceptual, logical, and physical data models, ensuring optimal performance, scalability, and maintainability.
- Use profiling tools to identify slow or resource-intensive SQL queries and develop solutions to improve performance (see the query sketch below).
- Focus on performance tuning, especially for complex queries, stored procedures, and indexing strategies.
- Design and implement new features with a focus on scalability and maintainability.
- Document and define data modelling requirements, ensuring that the application's database design aligns with technical and functional specifications.
- Ensure significant database design decisions are communicated and validated, adhering to best practices.
- Ensure the long-term reliability, scalability, and maintainability of systems.
- Collaborate with cross-functional teams to gather requirements and implement solutions.
- Assist in the adoption and application of industry best practices and guidelines.

Qualifications:
- Educational background: Bachelor's degree or higher in Information Systems, Computer Science, or a related field (or equivalent experience).
- Experience: 5+ years of experience as a SQL Server Database Developer or Database Administrator (DBA).

Technical Skills:
- Strong expertise in SQL and experience in writing complex SQL queries.
- Hands-on experience with SQL-XML programming.
- Extensive experience with SQL Server (Microsoft) and database architectures.
- Familiarity with performance tuning of SQL queries, stored procedures, and indexing strategies.
- Knowledge of data profiling tools and performance optimization (CPU/memory/I/O concerns).
- Experience with data modelling and database design.
- Knowledge of ETL tools like Pentaho is a plus.
- Programming skills in Java and a willingness to explore new languages or transition into a Full-stack Engineer role.
- Experience with Agile methodologies (preferably Scrum) and quick delivery through release management.

Soft Skills:
- Strong attention to detail and results-oriented approach.
- Passionate, intelligent, and a critical thinker with excellent problem-solving skills.
- Ability to thrive in a fast-paced environment with multiple ongoing projects.
- Excellent written and verbal communication skills.
- Collaborative mindset, with the ability to work with all levels of management and stakeholders.

Desired Traits:
- Self-motivated, technical, results-oriented, and quality-focused individual.
- Strong data warehouse and architecture skills.
- Excellent problem-solving abilities, proactive with a focus on delivering business value.
- A team player who is detail-oriented, respectful, and thoughtful.
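One common way to surface the slow or resource-intensive statements mentioned in the profiling responsibility above is SQL Server's execution-statistics DMVs. A minimal sketch is shown here; the top-10 cutoff and column aliases are illustrative choices, not requirements from the posting:

```sql
-- Top statements by average CPU, taken from the plan cache (SQL Server DMVs)
SELECT TOP (10)
    qs.execution_count,
    qs.total_worker_time / qs.execution_count  AS avg_cpu_microseconds,
    qs.total_elapsed_time / qs.execution_count AS avg_duration_microseconds,
    SUBSTRING(st.text, (qs.statement_start_offset / 2) + 1,
        ((CASE qs.statement_end_offset
              WHEN -1 THEN DATALENGTH(st.text)
              ELSE qs.statement_end_offset END
          - qs.statement_start_offset) / 2) + 1) AS statement_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY avg_cpu_microseconds DESC;
```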
Posted 2 weeks ago
6.0 - 8.0 years
25 - 30 Lacs
Bengaluru
Work from Office
Role & Responsibilities

Mandatory skills: Python, AWS, Data Modeler, SQL, and DevOps (good to have, not mandatory).
Please avoid candidates who graduated from universities in Hyderabad and Telangana; Hyderabad and Telangana candidates can be considered only if they are working in tier-one companies.

Job Description:
The ideal candidate will have 6 to 8 years of experience in data modelling and architecture, with deep expertise in Python, the AWS cloud stack, data warehousing, and enterprise data modelling tools. This individual will be responsible for designing and creating enterprise-grade data models and driving the implementation of a Layered Scalable Architecture or Medallion Architecture to support robust, scalable, and high-quality data marts across multiple business units. This role will involve managing complex datasets from systems like PoS, ERP, CRM, and external sources, while optimizing performance and cost. You will also provide strategic leadership on data modelling standards, governance, and best practices, ensuring the foundation for analytics and reporting is solid and future-ready.

Key Responsibilities:
- Design and deliver conceptual, logical, and physical data models using tools like ERwin.
- Implement Layered Scalable Architecture / Medallion Architecture for building scalable, standardized data marts.
- Optimize performance and cost of AWS-based data infrastructure (Redshift, S3, Glue, Lambda, etc.).
- Collaborate with cross-functional teams (IT, business, analysts) to gather data requirements and ensure model alignment with KPIs and business logic.
- Develop and optimize SQL code, materialized views, and stored procedures in AWS Redshift (see the sketch below).
- Ensure data governance, lineage, and quality mechanisms are established across systems.
- Lead and mentor technical teams in an Agile project delivery model.
- Manage data layer creation and documentation: data dictionary, ER diagrams, purpose mapping.
- Identify data gaps and availability issues with respect to source systems.

Required Skills & Qualifications:
- Bachelor's or Master's degree in Computer Science, IT, or a related field (B.E./B.Tech/M.E./M.Tech/MCA).
- Minimum 8 years of experience in data modeling and architecture.
- Proficiency with data modeling tools such as ERwin, with strong knowledge of forward and reverse engineering.
- Deep expertise in SQL (including advanced SQL, stored procedures, performance tuning).
- Strong experience in Python, data warehousing, RDBMS, and ETL tools like AWS Glue, IBM DataStage, or SAP Data Services.
- Hands-on experience with AWS services: Redshift, S3, Glue, RDS, Lambda, Bedrock, and Q.
- Good understanding of reporting tools such as Tableau, Power BI, or AWS QuickSight.
- Exposure to DevOps/CI-CD pipelines, AI/ML, Gen AI, NLP, and polyglot programming is a plus.
- Familiarity with data governance tools (e.g., ORION/EIIG).
- Domain knowledge in Retail, Manufacturing, HR, or Finance preferred.
- Excellent written and verbal communication skills.

Certifications (Preferred):
- AWS certification (e.g., AWS Certified Solutions Architect or Data Analytics Specialty).
- Data governance or data modelling certifications (e.g., CDMP, Databricks, or TOGAF).

Mandatory Skills: Python, AWS, Technical Architecture, AIML, SQL, Data Warehousing, Data Modelling

Preferred candidate profile: Share resumes on Sakunthalaa@valorcrest.in
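As a small illustration of the Redshift materialized-view work referenced in the responsibilities above, a minimal sketch follows; the table, column, and key names are hypothetical:

```sql
-- Pre-aggregated revenue layer for a Redshift-backed data mart
CREATE MATERIALIZED VIEW mv_daily_store_revenue
DISTKEY (store_id)
SORTKEY (sale_date)
AUTO REFRESH YES
AS
SELECT
    store_id,
    sale_date,
    SUM(net_amount) AS total_revenue,
    COUNT(*)        AS line_count
FROM pos_sales
GROUP BY store_id, sale_date;
```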
Posted 2 weeks ago
7.0 - 12.0 years
22 - 37 Lacs
Bengaluru, Mumbai (All Areas)
Hybrid
Hiring: Data Engineering Senior Software Engineer / Tech Lead / Senior Tech Lead
- Location: Mumbai & Bengaluru, Hybrid (3 days from office)
- Shift: 2 PM to 11 PM IST
- Experience: 5 to 12+ years (based on role and grade)

Open Grades/Roles:
- Senior Software Engineer: 5–8 years
- Tech Lead: 7–10 years
- Senior Tech Lead: 10–12+ years

Job Description – Data Engineering Team

Core Responsibilities (Common to All Levels):
- Design, build and optimize ETL/ELT pipelines using tools like Pentaho, Talend, or similar
- Work on traditional databases (PostgreSQL, MSSQL, Oracle) and MPP/modern systems (Vertica, Redshift, BigQuery, MongoDB)
- Collaborate cross-functionally with BI, Finance, Sales, and Marketing teams to define data needs
- Participate in data modeling (ER/DW/star schema), data quality checks, and data integration
- Implement solutions involving messaging systems (Kafka), REST APIs, and scheduler tools (Airflow, Autosys, Control-M)
- Ensure code versioning and documentation standards are followed (Git/Bitbucket)

Additional Responsibilities by Grade

Senior Software Engineer (5–8 yrs):
- Focus on hands-on development of ETL pipelines, data models, and data inventory
- Assist in architecture discussions and POCs
- Good to have: Tableau/Cognos, Python/Perl scripting, GCP exposure

Tech Lead (7–10 yrs):
- Lead mid-sized data projects and small teams
- Decide on ETL strategy (push down / push up) and performance tuning
- Strong working knowledge of orchestration tools, resource management, and agile delivery

Senior Tech Lead (10–12+ yrs):
- Drive data architecture, infrastructure decisions, and internal framework enhancements
- Oversee large-scale data ingestion, profiling, and reconciliation across systems
- Mentor junior leads and own stakeholder delivery end-to-end
- Advantageous: experience with AdTech/Marketing data and the Hadoop ecosystem (Hive, Spark, Sqoop)

Must-Have Skills (All Levels):
- ETL tools: Pentaho / Talend / SSIS / Informatica
- Databases: PostgreSQL, Oracle, MSSQL, Vertica / Redshift / BigQuery
- Orchestration: Airflow / Autosys / Control-M / JAMS
- Modeling: dimensional modeling, ER diagrams
- Scripting: Python or Perl (preferred)
- Agile environment, Git-based version control
- Strong communication and documentation
Posted 2 weeks ago
4.0 - 6.0 years
6 - 8 Lacs
Bengaluru
Work from Office
About the Role:
We are seeking a skilled and detail-oriented Data Migration Specialist with hands-on experience in Alteryx and Snowflake. The ideal candidate will be responsible for analyzing existing Alteryx workflows, documenting the logic and data transformation steps, and converting them into optimized, scalable SQL queries and processes in Snowflake. The ideal candidate should have solid SQL expertise and a strong understanding of data warehousing concepts. This role plays a critical part in our cloud modernization and data platform transformation initiatives.

Key Responsibilities:
- Analyze and interpret complex Alteryx workflows to identify data sources, transformations, joins, filters, aggregations, and output steps.
- Document the logical flow of each Alteryx workflow, including inputs, business logic, and outputs.
- Translate Alteryx logic into equivalent SQL scripts optimized for Snowflake, ensuring accuracy and performance.
- Write advanced SQL queries and stored procedures, and use Snowflake-specific features like Streams, Tasks, Cloning, Time Travel, and Zero-Copy Cloning (a short example appears after this listing).
- Implement data ingestion strategies using Snowpipe, stages, and external tables.
- Optimize Snowflake performance through query tuning, partitioning, clustering, and caching strategies.
- Collaborate with data analysts, engineers, and stakeholders to validate transformed logic against expected results.
- Handle data cleansing, enrichment, aggregation, and business logic implementation within Snowflake.
- Suggest improvements and automation opportunities during migration.
- Conduct unit testing and support UAT (User Acceptance Testing) for migrated workflows.
- Maintain version control, documentation, and an audit trail for all converted workflows.

Required Skills:
- Bachelor's or Master's degree in Computer Science, Information Technology, Data Science, or a related field.
- At least 4 years of hands-on experience in designing and developing scalable data solutions using the Snowflake Data Cloud platform.
- Extensive experience with Snowflake, including designing and implementing Snowflake-based solutions.
- 1+ years of experience with Alteryx Designer, including advanced workflow development and debugging.
- Strong proficiency in SQL, with 3+ years specifically working with Snowflake or other cloud data warehouses.
- Python programming experience focused on data engineering.
- Experience with data APIs and batch/stream processing.
- Solid understanding of data transformation logic like joins, unions, filters, formulas, aggregations, pivots, and transpositions.
- Experience in performance tuning and optimization of SQL queries in Snowflake.
- Familiarity with Snowflake features like CTEs, window functions, Tasks, Streams, Stages, and External Tables.
- Exposure to migration or modernization projects from ETL tools (like Alteryx/Informatica) to SQL-based cloud platforms.
- Strong documentation skills and attention to detail.
- Experience working in Agile/Scrum development environments.
- Good communication and collaboration skills.
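To illustrate the Streams and Tasks pattern named in the listing, here is a minimal Snowflake sketch of a scheduled transform that picks up newly ingested rows; all object names, the warehouse, the schedule, and the business rule are hypothetical:

```sql
-- Stream tracks new rows landing in the raw table (e.g., loaded by Snowpipe)
CREATE OR REPLACE STREAM raw_orders_stream ON TABLE raw_orders;

-- Task replays an Alteryx-style filter + aggregate on a schedule,
-- but only when the stream actually has new data
CREATE OR REPLACE TASK transform_orders_task
  WAREHOUSE = transform_wh
  SCHEDULE = '15 MINUTE'
WHEN SYSTEM$STREAM_HAS_DATA('RAW_ORDERS_STREAM')
AS
  INSERT INTO curated_orders (order_id, customer_id, order_total)
  SELECT order_id, customer_id, SUM(line_amount)
  FROM raw_orders_stream
  WHERE status = 'COMPLETE'
  GROUP BY order_id, customer_id;

-- Tasks are created suspended; resume to start the schedule
ALTER TASK transform_orders_task RESUME;
```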
Posted 2 weeks ago
8.0 - 10.0 years
10 - 12 Lacs
Hyderabad
Work from Office
Details of the role: 8 to 10 years of experience as an Informatica Admin (IICS).

Key responsibilities:
- Understand the program's service catalog and document the list of tasks that must be performed for each.
- Lead the design, development, and maintenance of ETL processes to extract, transform, and load data from various sources into our data warehouse.
- Implement best practices for data loading, ensuring optimal performance and data quality.
- Utilize your expertise in IDMC to establish and maintain data governance, data quality, and metadata management processes.
- Implement data controls to ensure compliance with data standards, security policies, and regulatory requirements.
- Collaborate with data architects to design and implement scalable and efficient data architectures that support business intelligence and analytics requirements.
- Work on data modeling and schema design to optimize database structures for ETL processes.
- Identify and implement performance optimization strategies for ETL processes, ensuring timely and efficient data loading.
- Troubleshoot and resolve issues related to data integration and performance bottlenecks.
- Collaborate with cross-functional teams, including data scientists, business analysts, and other engineering teams, to understand data requirements and deliver effective solutions.
- Provide guidance and mentorship to junior members of the data engineering team.
- Create and maintain comprehensive documentation for ETL processes, data models, and data flows, and keep it up to date with any changes to data architecture or ETL workflows.
- Use Jira for task tracking and project management.
- Implement data quality checks and validation processes to ensure data integrity and reliability.
- Maintain detailed documentation of data engineering processes and solutions.

Required Skills:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Proven experience as a Senior ETL Data Engineer, with a focus on IDMC/IICS.
- Strong proficiency in ETL tools and frameworks (e.g., Informatica Cloud, Talend, Apache NiFi).
- Expertise in IDMC principles, including data governance, data quality, and metadata management.
- Solid understanding of data warehousing concepts and practices.
- Strong SQL skills and experience working with relational databases.
- Excellent problem-solving and analytical skills.
Posted 2 weeks ago
1.0 - 5.0 years
3 - 7 Lacs
Mumbai
Work from Office
We are looking for a highly skilled and experienced Senior Analyst to join our team at eClerx Services Ltd. The ideal candidate will have a strong background in IT Services & Consulting, with excellent analytical and problem-solving skills.

Roles and Responsibilities:
- Collaborate with cross-functional teams to identify and prioritize project requirements.
- Develop and implement process improvements to increase efficiency and productivity.
- Analyze complex data sets to inform business decisions and drive growth.
- Provide expert-level support for data analysis and reporting.
- Identify and mitigate risks associated with data quality and integrity.
- Develop and maintain technical documentation for processes and procedures.

Job Requirements:
- Strong understanding of IT Services & Consulting industry trends and technologies.
- Excellent analytical and problem-solving skills, with attention to detail.
- Ability to work collaboratively in a fast-paced environment.
- Strong communication and interpersonal skills, with the ability to present complex ideas simply.
- Experience with data analysis and reporting tools, such as Excel or SQL.
- Ability to adapt to changing priorities and deadlines in a dynamic environment.

About Company:
eClerx Services Ltd. is a leading provider of IT Services & Consulting solutions, committed to delivering exceptional results and exceeding client expectations.
Posted 2 weeks ago
0.0 - 3.0 years
2 - 6 Lacs
Mohali
Work from Office
We are looking for a highly skilled and experienced Analyst to join our team at eClerx Services Ltd. The ideal candidate will have a strong background in IT Services & Consulting, with excellent analytical skills and attention to detail.

Roles and Responsibilities:
- Collaborate with cross-functional teams to identify and prioritize project requirements.
- Develop and maintain complex data analysis systems and reports.
- Provide expert-level support for data analysis and reporting needs.
- Identify trends and patterns in large datasets to inform business decisions.
- Develop and implement process improvements to increase efficiency and productivity.
- Communicate findings and insights to stakeholders through clear and concise reports.

Job Requirements:
- Strong understanding of data analysis principles and techniques.
- Proficiency in data visualization tools and software.
- Excellent communication and interpersonal skills.
- Ability to work in a fast-paced environment with multiple priorities.
- Strong problem-solving skills and attention to detail.
- Experience working with large datasets and developing complex reports.

Title: Analyst, ref: 78642.
Posted 2 weeks ago
1.0 - 5.0 years
3 - 7 Lacs
Chandigarh
Work from Office
We are looking for a highly skilled and experienced Senior Analyst to join our team at eClerx Services Ltd. The ideal candidate will have a strong background in IT Services & Consulting, with excellent analytical and problem-solving skills.

Roles and Responsibilities:
- Collaborate with cross-functional teams to identify and prioritize project requirements.
- Develop and maintain complex data models and reports using various tools and technologies.
- Analyze large datasets to extract insights and trends, and provide recommendations to stakeholders.
- Design and implement process improvements to increase efficiency and productivity.
- Provide expert-level support for data analysis and reporting, ensuring high-quality results.
- Stay up-to-date with industry trends and emerging technologies to continuously improve skills and knowledge.

Job Requirements:
- Strong understanding of IT Services & Consulting principles and practices.
- Excellent analytical and problem-solving skills, with the ability to think critically and creatively.
- Proficiency in data modeling and reporting tools, with experience working with large datasets.
- Strong communication and collaboration skills, with the ability to work effectively with cross-functional teams.
- Ability to prioritize tasks and manage multiple projects simultaneously, demonstrating strong time management skills.
- Experience with process improvement initiatives, focusing on increasing efficiency and productivity.
Posted 2 weeks ago
1.0 - 3.0 years
2 - 3 Lacs
Bengaluru
Work from Office
Job Title: Analyst Intern - Storefront Team
Location: Bangalore, India
Duration: 3-6 months
Team: Storefront (Product & Experience)

About Udaan 2.0:
Udaan is Myntra's initiative specifically designed to offer a career launchpad to people with disabilities. It is a six-month paid internship that ensures a conducive environment facilitating a smooth transition to work. With structured onboarding, customized learning and development programs, mentorship opportunities, on-the-job learning and best-in-class benefits, we aim to provide an environment that is supportive, so that you can thrive and build your career with us. As a part of our commitment towards diversity and inclusion, through this program, we strive to create a culture where all can belong and bring their experiences and authentic selves to work every day. During your internship with us, you will get the opportunity to work with the best talent in the e-commerce industry and work on projects that match your interests and abilities and could lead to full-time employment with Myntra.

About Myntra:
Myntra is India's leading fashion and lifestyle e-commerce platform, known for delivering a personalized and engaging shopping experience to millions. Our Storefront team plays a pivotal role in crafting the user journey and ensuring every touchpoint on the app/website drives discovery, engagement, and conversion.

Role Overview:
We are looking for a data-driven and curious Analyst Intern to join the Storefront team. You will work closely with product managers, designers, engineers, and marketing teams to analyze platform data, build performance dashboards, derive insights, and contribute to optimization experiments across the customer funnel.

Key Responsibilities:
- Analyze customer behavior across key Storefront surfaces like the homepage, PLP, PDP and navigation.
- Create and maintain dashboards to track KPIs such as click-through rate (CTR), conversion rate, engagement time, and bounce rate.
- Partner with product and design teams to measure A/B test performance and interpret results.
- Conduct root cause analysis for performance dips or changes in user patterns.
- Identify growth opportunities and generate hypotheses for UX, content, or merchandising enhancements.
- Prepare weekly reports and business review decks for leadership consumption.

Qualifications:
- Pursuing or recently completed a Bachelor's or Master's degree in Engineering, Statistics, Mathematics, Economics, or related fields.
- Strong proficiency in SQL and Excel; familiarity with data visualization tools like Tableau/Power BI preferred.
- Exposure to Python/R for data analysis is a plus.
- Excellent analytical and problem-solving skills with attention to detail.
- Ability to work in a fast-paced, collaborative environment.
Posted 2 weeks ago