7.0 - 12.0 years
30 - 45 Lacs
Hyderabad, Gurugram, Bengaluru
Work from Office
Senior Data Modeller (Telecom Domain)
Job Location: Anywhere in India (preferred locations: Gurugram, Noida, Hyderabad, Bangalore)
Experience: 7+ years
Domain: Telecommunications
Job Summary: We are hiring a Senior Data Modeller with strong telecom domain expertise. You will design and standardize enterprise-wide data models across domains such as Customer, Product, Billing, and Network, ensuring alignment with TM Forum standards (SID, eTOM). You'll collaborate with cross-functional teams to translate business needs into scalable, governed data structures that support analytics, ML, and digital transformation.
Key Responsibilities:
Design logical/physical data models for telecom domains
Align models with TM Forum SID, eTOM, ODA, and data mesh principles
Develop schemas (normalized, star, snowflake) based on business needs
Maintain data lineage, metadata, and version control
Collaborate with engineering teams on Azure and Databricks implementations
Tag data for privacy, compliance (GDPR), and data quality
Required Skills:
7+ years in data modelling, including 3+ years in the telecom domain
Proficient in TM Forum standards and telecom business processes
Hands-on with data modeling tools (SSAS, dbt, Informatica)
Expertise in SQL, metadata documentation, and schema design
Cloud experience: Azure Synapse, Databricks, Snowflake
Experience with CRM, billing, network usage, and campaign data models
Familiar with data mesh, domain-driven design, and regulatory frameworks
Education: Bachelor's or Master's in CS, Telecom Engineering, or a related field
Please go through the JD and, if you are interested, share your updated resume along with the following details:
Current CTC (fixed plus variable)
Offer in hand (fixed plus variable)
Expected CTC
Notice period
A few points on relevant skills and experience
Email: sp@intellisearchonline.net
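Purely as an illustration of the star-schema work this posting describes (not part of the ad itself), here is a minimal sketch of dimensional-model DDL issued through the Snowflake Python connector; every name in it (account, tables, columns) is hypothetical.

```python
# Illustrative sketch only: star-schema DDL for a telecom billing domain,
# issued through the Snowflake Python connector. All names are hypothetical.
import snowflake.connector

DDL_STATEMENTS = [
    """
    CREATE TABLE IF NOT EXISTS dim_customer (
        customer_sk  INTEGER IDENTITY PRIMARY KEY,
        customer_id  VARCHAR NOT NULL,   -- natural key from the CRM
        segment      VARCHAR,
        region       VARCHAR
    )
    """,
    """
    CREATE TABLE IF NOT EXISTS fact_billing (
        bill_sk      INTEGER IDENTITY PRIMARY KEY,
        customer_sk  INTEGER REFERENCES dim_customer (customer_sk),
        bill_month   DATE,
        amount_due   NUMBER(12, 2),
        amount_paid  NUMBER(12, 2)
    )
    """,
]

conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="...",
    database="EDW", schema="MODEL",
)
with conn.cursor() as cur:
    for stmt in DDL_STATEMENTS:
        cur.execute(stmt)
conn.close()
```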
Posted 1 week ago
3.0 - 8.0 years
9 - 19 Lacs
Pune, Gurugram, Bengaluru
Hybrid
Snowflake Developer
Posted 1 week ago
2.0 - 7.0 years
11 - 16 Lacs
Gurugram
Work from Office
Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.
Primary Responsibilities:
Conducting data analysis using SQL and Python to extract insights from large data sets
Conducting exploratory data analysis to identify trends, patterns, and insights from data
Developing AI/ML models and algorithms to automate and optimize business processes
Staying up to date with the latest advancements in AI/ML techniques and tools and identifying opportunities to apply them to enhance existing solutions
Documenting and communicating findings, methodologies, and insights to technical and non-technical stakeholders
Collaborating with cross-functional teams to understand business requirements and translate them into technical solutions
Complying with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regard to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so
Required Qualifications:
Bachelor's degree in Computer Science, Statistics, or a related field
2+ years of experience in SQL, Python, and Snowflake
Experience with exploratory data analysis and generating insights from data
Knowledge of machine learning algorithms and techniques
Proven solid problem-solving skills and attention to detail
Proven excellent communication and collaboration skills
Proven ability to work in a fast-paced environment and manage multiple projects simultaneously
At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes, an enterprise priority reflected in our mission. #Nic
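As a concrete, entirely hypothetical illustration of the SQL/Python exploratory analysis this role describes, here is a small pandas sketch; the file and column names are invented for the example.

```python
# Illustrative sketch only: basic exploratory analysis of a claims-style
# dataset with pandas. File and column names are hypothetical.
import pandas as pd

df = pd.read_csv("claims_sample.csv", parse_dates=["service_date"])

# Quick structural checks: shape and missingness.
print(df.shape)
print(df.isna().mean().sort_values(ascending=False).head())

# Trend: monthly claim volume and average paid amount.
monthly = (
    df.set_index("service_date")
      .resample("MS")
      .agg(claims=("claim_id", "count"), avg_paid=("paid_amount", "mean"))
)
print(monthly.tail(12))

# Simple anomaly flag: months more than 3 standard deviations from the mean.
z = (monthly["claims"] - monthly["claims"].mean()) / monthly["claims"].std()
print(monthly[z.abs() > 3])
```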
Posted 1 week ago
4.0 - 9.0 years
9 - 14 Lacs
Hyderabad
Work from Office
As part of our strategic initiative to build a centralized capability around data and cloud engineering, we are establishing a dedicated Azure Cloud Data Engineering practice. This team will be at the forefront of designing, developing, and deploying scalable data solutions on the cloud, primarily using the Microsoft Azure platform. The practice will serve as a centralized team, driving innovation, standardization, and best practices across cloud-based data initiatives. New hires will play a pivotal role in shaping the future of our data landscape, collaborating with cross-functional teams, clients, and stakeholders to deliver impactful, end-to-end solutions.
Primary Responsibilities:
Ingest data from multiple on-prem and cloud data sources using various tools and capabilities in Azure
Design and develop Azure Databricks processes using PySpark/Spark-SQL
Design and develop orchestration jobs using ADF and Databricks Workflows
Analyze data engineering processes under development and act as an SME to troubleshoot performance issues and suggest improvements
Develop and maintain CI/CD processes using Jenkins, GitHub, GitHub Actions, etc.
Build a test framework for Databricks notebook jobs for automated testing before code deployment
Design and build POCs to validate new ideas, tools, and architectures in Azure
Continuously explore new Azure services and capabilities and assess their applicability to business needs
Create detailed documentation for cloud processes, architecture, and implementation patterns
Work with the data & analytics team to build and deploy efficient data engineering processes and jobs on the Azure cloud
Prepare case studies and technical write-ups to showcase successful implementations and lessons learned
Work closely with clients, business stakeholders, and internal teams to gather requirements and translate them into technical solutions using best practices and appropriate architecture
Contribute to full-lifecycle project implementations, from design and development to deployment and monitoring
Ensure solutions adhere to security, compliance, and governance standards
Monitor and optimize data pipelines and cloud resources for cost and performance efficiency
Identify solutions to non-standard requests and problems
Support and maintain the self-service BI warehouse
Mentor and support existing on-prem developers in the cloud environment
Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regard to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so
Required Qualifications:
Undergraduate degree or equivalent experience
4+ years of overall experience in data & analytics engineering
4+ years of experience working with Azure, Databricks, ADF, and Data Lake
Solid experience working with data platforms and products using PySpark and Spark-SQL
Solid experience with CI/CD tools such as Jenkins, GitHub, GitHub Actions, Maven, etc.
In-depth understanding of Azure architecture and the ability to produce efficient designs and solutions
Highly proficient in Python and SQL
Proven excellent communication skills
Preferred Qualifications:
Snowflake and Airflow experience
Power BI development experience
Experience or knowledge of health care concepts: E&I, M&R, C&S LOBs, Claims, Members, Provider, Payers, Underwriting
At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes, an enterprise priority reflected in our mission. #NIC
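For illustration of the Databricks PySpark pattern this posting names, a minimal sketch of an ingestion-and-curation job follows; the paths, column names, target table, and Delta availability (standard on Databricks) are assumptions of the example, not part of the role.

```python
# Illustrative sketch only: a minimal Databricks-style PySpark job that reads
# raw JSON from ADLS and writes a curated Delta table. Paths are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("curate_orders").getOrCreate()

raw = (
    spark.read.format("json")
    .load("abfss://raw@mylake.dfs.core.windows.net/orders/")
)

curated = (
    raw.dropDuplicates(["order_id"])                  # de-duplicate on the key
       .withColumn("order_date", F.to_date("order_ts"))
       .filter(F.col("amount") > 0)                   # drop invalid rows
)

(curated.write.format("delta")
    .mode("overwrite")
    .partitionBy("order_date")
    .saveAsTable("curated.orders"))
```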
Posted 1 week ago
4.0 - 9.0 years
15 - 21 Lacs
Bengaluru
Work from Office
About Zscaler: Serving thousands of enterprise customers around the world, including 45% of Fortune 500 companies, Zscaler (NASDAQ: ZS) was founded in 2007 with a mission to make the cloud a safe place to do business and a more enjoyable experience for enterprise users. As the operator of the world's largest security cloud, Zscaler accelerates digital transformation so enterprises can be more agile, efficient, resilient, and secure. The pioneering, AI-powered Zscaler Zero Trust Exchange™ platform, which is found in our SASE and SSE offerings, protects thousands of enterprise customers from cyberattacks and data loss by securely connecting users, devices, and applications in any location. Named a Best Workplace in Technology by Fortune and others, Zscaler fosters an inclusive and supportive culture that is home to some of the brightest minds in the industry. If you thrive in an environment that is fast-paced and collaborative, and you are passionate about building and innovating for the greater good, come make your next move with Zscaler.
We built the Zscaler architecture from the ground up as a platform that could extend to new features and services. Our Product Management team takes hold of this massive opportunity to deliver our customers a growing portfolio of never-before-seen capabilities in threat prevention, visibility, scalability, and business enablement. Our product managers are champions of innovation with a shared vision for Zscaler and the limitless possibilities of cloud security. Join us to make your mark on the planning and product roadmap at the forefront of the world's cloud security leader.
We are looking for an Education Operations Specialist with analytics experience, reporting to the Platform Training Operations Manager. You will support various cross-functional teams within Zscaler, such as the Partner Technical Enablement Team, Demo & Labs Team, and other key stakeholders.
In this role, you will be responsible for:
Operating as part of the global Platform Training and Certification team and contributing to tier-1 support of the Partner Academy Program and our demo platform, with adaptable hours aligned to US time zones
Analyzing data to answer key questions for stakeholders or yourself, with an eye on what drives business performance, and investigating and communicating which areas need improvement in efficiency and productivity
Assisting with and creating rich interactive visualizations through data interpretation and analysis, with reporting components from multiple data sources
Providing critical operations support for Technical Management, Business Development, Training, and Curriculum Development functions
Assisting in developmental operations processes as well as maintenance for new and existing initiatives to drive growth and certifications, and contributing to an expanded operations role
What We're Looking For (Minimum Qualifications):
Bachelor's degree in business, information technology, or similar
3+ years of experience mining data as a data analyst
Experience with SQL, with an aptitude for learning other analytics tools
Experience with project management, focused on delivering strategic solutions and coordinating with teams to improve processes in a scaling environment
What Will Make You Stand Out (Preferred Qualifications):
Proficiency with business productivity tools like GSuite, Asana, Tableau, Jira, Confluence, ServiceNow, and Salesforce
Experience managing Asana or other work management platforms
Experience with Salesforce data, Snowflake, database and model design, and segmentation techniques
#LI-Hybrid #LI-KM8
At Zscaler, we are committed to building a team that reflects the communities we serve and the customers we work with. We foster an inclusive environment that values all backgrounds and perspectives, emphasizing collaboration and belonging. Join us in our mission to make doing business seamless and secure. Our Benefits program is one of the most important ways we support our employees. Zscaler proudly offers comprehensive and inclusive benefits to meet the diverse needs of our employees and their families throughout their life stages, including:
Various health plans
Time off plans for vacation and sick time
Parental leave options
Retirement options
Education reimbursement
In-office perks, and more!
By applying for this role, you adhere to applicable laws, regulations, and Zscaler policies, including those related to security and privacy standards and guidelines. Zscaler is committed to providing equal employment opportunities to all individuals. We strive to create a workplace where employees are treated with respect and have the chance to succeed. All qualified applicants will be considered for employment without regard to race, color, religion, sex (including pregnancy or related medical conditions), age, national origin, sexual orientation, gender identity or expression, genetic information, disability status, protected veteran status, or any other characteristic protected by federal, state, or local laws. See more information by clicking on the Know Your Rights: Workplace Discrimination is Illegal link.
Pay Transparency: Zscaler complies with all applicable federal, state, and local pay transparency rules.
Zscaler is committed to providing reasonable support (called accommodations or adjustments) in our recruiting processes for candidates who are differently abled, have long term conditions, mental health conditions or sincerely held religious beliefs, or who are neurodivergent or require pregnancy-related support.
Posted 1 week ago
3.0 - 7.0 years
4 - 7 Lacs
Mumbai, Delhi / NCR, Bengaluru
Work from Office
About the Role: We're hiring two Cloud & Data Engineering Specialists to join our fast-paced, agile team. These roles are focused on designing, developing, and scaling modern, cloud-based data engineering solutions using tools like Azure, AWS, GCP, Databricks, Kafka, PySpark, SQL, Snowflake, and ADF.
Position 1: Cloud & Data Engineering Specialist
Key Responsibilities:
Develop and manage cloud-native solutions on Azure or AWS
Build real-time streaming apps with Kafka
Engineer services using Java and Python
Deploy and manage Kubernetes-based containerized applications
Process big data using Databricks
Administer SQL Server and Snowflake databases and write advanced SQL
Utilize Unix/Linux for system operations
Must-Have Skills:
Azure or AWS cloud experience
Kafka, Java, Python, Kubernetes
Databricks, SQL Server, Snowflake
Unix/Linux commands
Location: Remote, Delhi NCR, Bengaluru, Chennai, Pune, Kolkata, Ahmedabad, Mumbai, Hyderabad
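As a hypothetical illustration of the Kafka streaming work listed above, here is a minimal Python consumer sketch using the kafka-python library; the topic, broker address, consumer group, and message fields are invented for the example.

```python
# Illustrative sketch only: a minimal Kafka consumer with kafka-python.
# Topic, brokers, and message fields are hypothetical.
import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "orders",                                   # hypothetical topic
    bootstrap_servers=["broker1:9092"],
    group_id="orders-etl",
    auto_offset_reset="earliest",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)

for message in consumer:
    order = message.value
    # Placeholder for real processing, e.g. writing to Snowflake or S3.
    print(order.get("order_id"), order.get("amount"))
```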
Posted 1 week ago
5.0 - 10.0 years
20 - 25 Lacs
Mumbai
Work from Office
Entity: Accenture Strategy & Consulting
Team: Strategy & Consulting Global Network
Practice: Marketing Analytics
Title: Data Science Manager
Job location: Gurgaon
About S&C - Global Network: Accenture's Applied Intelligence practice helps our clients grow their business in entirely new ways. Analytics enables our clients to achieve high performance through insights from data, insights that inform better decisions and strengthen customer relationships. From strategy to execution, Accenture works with organizations to develop analytic capabilities, from accessing and reporting on data to predictive modelling, to outperform the competition.
WHAT'S IN IT FOR YOU: As part of our Analytics practice, you will join a worldwide network of over 20,000 smart and driven colleagues experienced in leading statistical tools, methods and applications. From data to analytics and insights to actions, our forward-thinking consultants provide analytically informed, issue-based insights at scale to help our clients improve outcomes and achieve high performance. Accenture will continually invest in your learning and growth. You'll work with MMM experts, and Accenture will support you in growing your own tech stack and certifications. In Applied Intelligence you will understand the importance of sound analytical decision-making and the relationship of tasks to the overall project, and execute projects in the context of a business performance improvement initiative.
What you would do in this role:
Work through the phases of the project
Define data requirements for creating a model and understand the business problem
Clean, aggregate, analyze, and interpret data, and carry out quality analysis of it
5+ years of advanced experience in Market Mix Modeling and related concepts of optimizing promotional channels and budget allocation
Experience working with non-linear optimization techniques
Proficiency in statistical and probabilistic methods such as SVM, decision trees, bagging and boosting techniques, and clustering
Hands-on experience with Python data-science and math packages such as NumPy, Pandas, scikit-learn, Seaborn, PyCaret, and Matplotlib
Development of AI/ML models
Develop and manage data pipelines
Develop and manage data within different layers of Azure/Snowflake
Awareness of common design patterns for scalable machine learning architectures, as well as tools for deploying and maintaining machine learning models in production
Knowledge of cloud platforms and their use for pipelining, deploying, and scaling marketing mix models
Working knowledge of an MMM optimizer and its intricacies
Awareness of MMM application development and backend engine integration is preferred
Work along with the team and consultant/manager
Well versed in creating insights presentations and client-ready decks
Able to mentor and guide a team of 10-15 people
Manage client relationships and expectations, and communicate insights and recommendations effectively
Capability building and thought leadership
Logical thinking: able to think analytically, using a systematic and logical approach to analyze data, problems, and situations; notices discrepancies and inconsistencies in information and materials
Task management: advanced level of task management knowledge and experience; able to plan own tasks, discuss and work on priorities, and track and report progress
Qualification - Who we are looking for:
5+ years of work experience in consulting/analytics with a reputed organization is desirable
Master's degree in Statistics/Econometrics/Economics, B.Tech/M.Tech, Master's/M.Tech in Computer Science, or M.Phil/Ph.D. in statistics/econometrics or a related field from a reputed college
Must have knowledge of SQL and Python and at least one cloud-based technology (Azure, AWS, GCP)
Must have good knowledge of market mix modeling techniques and optimization algorithms and their applicability to industry data
Must have data migration experience from cloud to Snowflake (Azure, GCP, AWS)
Managing sets of XML, JSON, and CSV files from disparate sources
Manage documentation of data models, architecture, and maintenance processes
Understanding of econometric/statistical modeling and analysis techniques such as regression analysis, hypothesis testing, multivariate statistical analysis, time series techniques, and optimization techniques, and statistical packages such as R, Python, Java, SQL, Spark, etc.
Working knowledge of machine learning algorithms like Random Forest, Gradient Boosting, Neural Networks, etc.
Proficient in Excel, MS Word, PowerPoint, etc.
Strong client and team management skills and planning of large-scale projects with risk assessment
Accenture is an equal opportunities employer and welcomes applications from all sections of society and does not discriminate on grounds of race, religion or belief, ethnic or national origin, disability, age, citizenship, marital, domestic or civil partnership status, sexual orientation, gender identity, or any other basis as protected by applicable law.
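For readers unfamiliar with market mix modeling, a toy sketch follows. It is not Accenture's methodology, just a geometric adstock transform feeding a log-log regression; the input file, column names, and decay rates are all invented for the example.

```python
# Illustrative sketch only: a toy market-mix model with geometric adstock
# and a log-log linear regression. All names and rates are hypothetical.
import numpy as np
import pandas as pd
from sklearn.linear_model import LinearRegression

def adstock(spend: pd.Series, decay: float = 0.5) -> pd.Series:
    """Geometric adstock: carry over a fraction of past spend each week."""
    out = np.zeros(len(spend))
    for t, x in enumerate(spend.to_numpy()):
        out[t] = x + (decay * out[t - 1] if t > 0 else 0.0)
    return pd.Series(out, index=spend.index)

df = pd.read_csv("weekly_marketing.csv")   # hypothetical input file
X = pd.DataFrame({
    "tv": adstock(df["tv_spend"], decay=0.6),
    "digital": adstock(df["digital_spend"], decay=0.3),
    "price": df["avg_price"],
})
y = df["sales"]

# Log-log form so coefficients read roughly as elasticities.
model = LinearRegression().fit(np.log1p(X), np.log1p(y))
print(dict(zip(X.columns, model.coef_)))
```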
Posted 1 week ago
6.0 - 11.0 years
10 - 18 Lacs
Bengaluru
Remote
We are looking for experienced DBAs who have worked on multiple database technologies and cloud migration projects for our clients worldwide.
6+ years of experience working on SQL/NoSQL/data warehouse platforms, on-premise and in the cloud (AWS, Azure & GCP)
Provide expert-level guidance on cloud adoption, data migration strategies, and digital transformation projects
Strong understanding of RDBMS, NoSQL, data warehouse, in-memory, and data lake architectures, features, and functionalities
Proficiency in SQL and data manipulation techniques
Experience with data loading and unloading tools and techniques
Expertise in data access management and database reliability & scalability; administer, configure, and optimize database resources and services across the organization
Ensure high availability, replication, and failover strategies
Implement serverless database architectures for cost-effective, scalable storage
Key Responsibilities:
Strong proficiency in database administration of one or more databases (Snowflake, BigQuery, Amazon Redshift, Teradata, SAP HANA, Oracle, PostgreSQL, MySQL, SQL Server, Cassandra, MongoDB, Neo4j, Cloudera, Micro Focus, IBM DB2, Elasticsearch, DynamoDB, Azure Synapse)
Plan and execute the migration of on-prem Database/Analysis Services/Reporting Services/Integration Services to AWS/Azure/GCP
Develop automation scripts using Python, shell scripting, or Terraform for streamlined database operations
Provide technical guidance and mentoring to junior DBAs and data engineers
Hands-on experience with data modelling, ETL/ELT processes, and data integration tools
Monitor and optimize the performance of virtual warehouses, queries, and the overall system
Optimize database performance through query tuning, indexing, and configuration
Manage replication, backups, and disaster recovery for high availability
Troubleshoot and resolve database issues, including performance bottlenecks, errors, and downtime
Collaborate with the infrastructure team to configure, manage, and monitor PostgreSQL in cloud environments (AWS, GCP, or Azure)
Provide on-call support for critical database operations and incidents
Provide Level 3 and 4 technical support, troubleshooting complex issues
Participate in cross-functional teams for database design and optimization
Posted 1 week ago
5.0 - 10.0 years
12 - 22 Lacs
Hyderabad, Ahmedabad
Hybrid
Job Role:
• Strong Snowflake Developer with good experience in SQL development and data analysis, required to develop a new, complex data warehouse
• In-depth knowledge of Azure Cloud services
• Strong Snowflake development experience
• Hands-on experience with Snowflake utilities, SnowSQL, and Snowpipe; able to administer and monitor the Snowflake computing platform
• Hands-on experience with data loads and managing a cloud DB
• Experience in the creation and modification of user accounts and security groups per request
• Handling large and complex sets of XML, JSON, and CSV from various sources and databases
• Solid grasp of database engineering and design
• Experience with any scripting language, preferably Python
Technical Skills Required:
• Snowflake
• Experience with other SQL-based databases, like Teradata, Oracle, SQL Server, etc.
Nice to have:
• Scripting with Python
• SnowPro certification
• Experience with an ETL tool, like Informatica, DataStage, Matillion, etc.
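As a hypothetical illustration of the Snowpipe work this posting mentions, here is a minimal sketch that creates a stage and a REST-triggered pipe through the Snowflake Python connector; the stage, storage integration, table, and connection names are all invented.

```python
# Illustrative sketch only: create an external stage and a Snowpipe
# definition via the Snowflake Python connector. All names are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="...",
    database="EDW", schema="RAW",
)
with conn.cursor() as cur:
    cur.execute("""
        CREATE STAGE IF NOT EXISTS raw_stage
        URL = 'azure://myaccount.blob.core.windows.net/raw'
        STORAGE_INTEGRATION = my_azure_int
    """)
    # A REST-triggered pipe; auto-ingest would additionally need
    # AUTO_INGEST = TRUE and a notification integration.
    cur.execute("""
        CREATE PIPE IF NOT EXISTS orders_pipe AS
        COPY INTO raw.orders
        FROM @raw_stage/orders/
        FILE_FORMAT = (TYPE = 'JSON')
    """)
conn.close()
```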
Posted 1 week ago
1.0 - 3.0 years
3 - 7 Lacs
Bengaluru
Work from Office
Type: Internship
Duration: 6 months
Key Responsibilities:
Collect, process, and analyze data from various internal systems and external sources.
Develop and maintain dashboards, reports, and visualizations to support business objectives.
Identify trends, anomalies, and opportunities through data analysis and communicate findings to stakeholders.
Collaborate with business, product, and engineering teams to define KPIs and performance metrics.
Perform exploratory data analysis to uncover business insights and support strategic planning.
Use statistical techniques to develop predictive models or support A/B testing.
Ensure data quality and integrity by validating datasets and troubleshooting discrepancies.
Document data processes and maintain version control for reports and code.
Requirements:
Bachelor's degree in Statistics, Mathematics, Computer Science, Economics, or a related field.
1-3 years of experience as a Data Analyst or in a related analytical role.
Proficient in SQL, Excel, and at least one data visualization tool (e.g., Power BI, Tableau, Looker).
Experience with data analysis tools/languages such as Python, R, or similar.
Strong analytical and problem-solving skills with attention to detail.
Excellent communication skills and the ability to present complex data in a clear and actionable manner.
Experience in the e-commerce industry.
Understanding of basic statistical methods and predictive analytics.
Should have a minimum six-month career gap at present.
Posted 1 week ago
2.0 - 5.0 years
2 - 6 Lacs
Bengaluru
Work from Office
About Company: Kinara Capital is a FinTech NBFC dedicated to driving financial inclusion in the MSME sector. Our mission is to transform lives, livelihoods, and local economies by providing fast and flexible loans without property collateral to small business entrepreneurs. Led by a women-majority management team, Kinara Capital values diversity and inclusion and fosters a collaborative working environment. Kinara Capital is the only company from India recognized globally by the World Bank/IFC with a gold award in 2019 as 'Bank of the Year-Asia' for our innovative work in SME financing. Kinara Capital is an RBI-registered Systemically Important NBFC. Headquartered in Bangalore, we have 127 branches across Karnataka, Gujarat, Maharashtra, Andhra Pradesh, Telangana, Tamil Nadu, and UT Puducherry, with more than 1,000 employees. https://kinaracapital.com/
Job Title: BI Engineer
Department: Data Science & BI
Reports To: BI Specialist - Assistant Functional Manager
Purpose of Job: To lead a team of BI analysts through the entire analytical and BI life cycle, building and deploying dashboards and creating automation solutions (such as reports) to infuse core business functions with deep analytical insights.
Job Responsibilities:
Develop, maintain, and manage advanced reporting, analytics, dashboards and other BI solutions.
Perform and document data analysis, data validation, and data mapping/design.
Coordinate with customers and business analysts to determine business data reporting and analysis requirements.
Assist in business report generation for internal and external customers for business decision making.
Support database administrators and developers in building data warehousing systems for business intelligence, reporting and analytics.
Assist in data administration, modelling and integration activities in data warehouse systems.
Implement business intelligence solutions to achieve data reporting and analysis goals.
Support the development of business intelligence standards to meet business goals.
Coordinate with business units to identify new data requirements, analysis strategies and reporting mechanisms.
Assist in the design and generation of reports in a timely and accurate manner.
Qualifications:
Education: B.Tech/B.E.
Work Experience: 3+ years in an analytical professional role, with 3+ years of BI experience with tools like Talend and Tableau
Age: Below 35 years
Other Requirements:
Domain knowledge in Financial Services or Marketing is a big plus.
Adept at the use of BI reporting tools such as Tableau or Metabase.
Understanding of data modelling and data schemas (MySQL, Aurora, Snowflake, etc.).
Understanding of database operations and optimization for SQL.
Understanding of data and query optimization, query profiling, and query performance monitoring tools and techniques.
Creating and maintaining business requirements and other technical documentation.
Knowledge of ETL, AWS, DBT and GIT will be an added advantage.
Skills & Competencies:
Technical Skills: Aptitude in math and stats; proven experience in the use of SQL, Talend ETL, and Tableau; comfortable with programming in Python and basic statistics.
Soft Skills: Deep curiosity and humility; excellent storyteller and communicator.
Competencies: High social responsibility and mission-driven culture; team player; high integrity and focus on building relationships; inspiring leader who instills confidence; people-oriented and goal-oriented; initiative; energetic; bias for action and a commitment to our social mission.
Place of work: Head office, Bangalore.
Job Type: Full Time
Posted 1 week ago
2.0 - 6.0 years
3 - 7 Lacs
Bengaluru
Work from Office
Type: Internship
Duration: 6 months
About Phoenix: Phoenix is Myntra's initiative specifically designed to offer a launchpad to women on a career break. It is a six-month internship that ensures a conducive environment facilitating a smooth transition back to work. With structured onboarding, customized learning and development programs, mentorship opportunities, on-the-job learning and best-in-class benefits, we aim to provide an environment that is supportive, so that you can re-discover your career with us. During your internship with us, you will get the opportunity to work with the best talent in the e-commerce industry and work on projects that match your interests and abilities, and that could lead to full-time employment with Myntra.
Key Responsibilities:
Collect, process, and analyze data from various internal systems and external sources.
Develop and maintain dashboards, reports, and visualizations to support business objectives.
Identify trends, anomalies, and opportunities through data analysis and communicate findings to stakeholders.
Collaborate with business, product, and engineering teams to define KPIs and performance metrics.
Perform exploratory data analysis to uncover business insights and support strategic planning.
Use statistical techniques to develop predictive models or support A/B testing.
Ensure data quality and integrity by validating datasets and troubleshooting discrepancies.
Document data processes and maintain version control for reports and code.
Requirements:
Bachelor's degree in Statistics, Mathematics, Computer Science, Economics, or a related field.
1-3 years of experience as a Data Analyst or in a related analytical role.
Proficient in SQL, Excel, and at least one data visualization tool (e.g., Power BI, Tableau, Looker).
Experience with data analysis tools/languages such as Python, R, or similar.
Strong analytical and problem-solving skills with attention to detail.
Excellent communication skills and the ability to present complex data in a clear and actionable manner.
Experience in the e-commerce industry.
Understanding of basic statistical methods and predictive analytics.
Should have a minimum six-month career gap at present.
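As a small, hypothetical illustration of the A/B-testing support the internship mentions, here is a two-sample test sketch with scipy; the CSV layout and column names are invented for the example.

```python
# Illustrative sketch only: a two-sample test for an A/B experiment.
# The file layout (columns: variant, converted) is hypothetical.
import pandas as pd
from scipy import stats

df = pd.read_csv("experiment_results.csv")
a = df.loc[df["variant"] == "A", "converted"]
b = df.loc[df["variant"] == "B", "converted"]

# Welch's t-test; for binary outcomes a proportions z-test is also common.
t_stat, p_value = stats.ttest_ind(a, b, equal_var=False)
print(f"lift={b.mean() - a.mean():.4f}, p={p_value:.4f}")
```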
Posted 1 week ago
3.0 - 5.0 years
12 - 18 Lacs
Bengaluru
Work from Office
Skills: React, Python, Django, PostgreSQL, MongoDB
Experience: 3-5 years
REQUIREMENTS:
Proven experience as a Full Stack Developer or similar role
Comfortable with Golang, Scala, Python, and Kafka, or the desire to learn these technologies
Experience in front-end web development helping to create customer-facing user interfaces; experience with ReactJS a plus
Familiarity with databases and data warehousing such as PostgreSQL, MongoDB, and Snowflake
Posted 1 week ago
10.0 - 15.0 years
35 - 45 Lacs
Bengaluru
Work from Office
Seeking a Technical Lead – MDM Solutions with 10+ years of data management experience and expertise in implementing MDM architectures, data governance, and cloud integration using platforms like Snowflake and AWS.
Posted 1 week ago
16.0 - 18.0 years
30 - 36 Lacs
Bengaluru
Work from Office
Data strategist, team leader, and financial advisor to clients.
Strong understanding of finance & accounting principles
End-to-end BI lifecycle, from solution architecture to insights delivery
Skills: DAX, ETL, SQL, Azure, Power BI, MSBI, DWH
Benefits: Provident fund, health insurance, annual bonus
Posted 1 week ago
6.0 - 8.0 years
0 Lacs
Noida, Gurugram
Hybrid
Skills Matrix:
- Snowflake
- Data Build Tool (dbt)
- SQL
Snowflake Developer openings only for Gurugram/Noida.
Posted 1 week ago
10.0 - 20.0 years
25 - 30 Lacs
Bengaluru
Remote
Role & responsibilities
Data Platform: Snowflake, dbt, Fivetran, Oracle OCI
Visualization: Tableau
Cloud & Identity: Azure, Microsoft Entra (Entra ID / Azure AD)
Infrastructure as Code: OpenTofu (Terraform alternative); migration from Terraform
Scripting & Monitoring: SQL, Python/Bash, monitoring tools
Posted 1 week ago
10.0 - 20.0 years
25 - 35 Lacs
Pune
Remote
Role & responsibilities
We are seeking a Production Support Lead with expertise in modern data platforms to oversee the reliability, performance, and user access control of our analytics and reporting environment. This individual will lead operational support across tools like Snowflake, dbt, Fivetran, Tableau, and Azure Entra, ensuring compliance and high data availability. The ideal candidate will not only resolve technical issues but also guide the team in scaling and automating platform operations.
Key Responsibilities:
Snowflake Platform Management: Oversee production operations, including query performance, object dependencies, warehouse sizing, replication setup, and role management.
Data Ingestion Support: Monitor and manage Fivetran pipelines for data ingestion from Oracle OCI to Snowflake.
Transformation Layer Oversight: Maintain and troubleshoot dbt jobs, ensuring timely and accurate data transformations.
Tableau Operations: Validate data sources, dashboard usage, performance, and version control; manage user access and role-level security; ensure SOX compliance for reporting environments.
Access & Identity Management: Administer access control via Microsoft Entra, including mapping users/groups to appropriate roles across environments.
IaC Operations: Lead and maintain OpenTofu (Terraform alternative) deployments for infrastructure provisioning.
Monitoring & Alerts: Set up automated monitoring tools and alerts for proactive detection of system anomalies.
Incident & Problem Management: Lead root cause analysis for production issues and coordinate with cross-functional teams for permanent fixes.
Compliance & Governance: Ensure the platform is audit-ready; document SOPs, user access logs, and remediation procedures.
Required Skills:
Strong knowledge and hands-on experience with Snowflake, dbt, Tableau, Fivetran, Oracle OCI, and OpenTofu/Terraform
Proven ability in Azure platform administration, especially Entra ID (formerly Azure AD)
Experience with access governance, data security, and SOX compliance
Proficiency in SQL, scripting (e.g., Python or Bash), and monitoring tools
Comfortable leading triage calls, managing escalations, and coordinating between dev, infra, and business teams
Preferred Qualifications:
Experience leading migrations (e.g., Terraform to OpenTofu)
ITIL or equivalent certification in incident/change/problem management
Experience working in highly regulated industries (finance, healthcare, manufacturing)
Role Expectations:
Act as a bridge between engineering, infra, and business stakeholders
Own SLAs, root cause reviews, and system documentation
Mentor junior support staff and onboard new hires
Recommend and implement automation for recurring issues or manual workflows
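As a hypothetical illustration of the proactive monitoring this role owns, here is a minimal sketch that flags long-running Snowflake queries through the Python connector; the threshold, account, and service-user names are invented for the example.

```python
# Illustrative sketch only: flag long-running Snowflake queries via the
# information_schema.query_history table function. Names are hypothetical.
import snowflake.connector

QUERY = """
    SELECT query_id, user_name, total_elapsed_time / 1000 AS seconds
    FROM table(information_schema.query_history())
    WHERE execution_status = 'RUNNING'
      AND total_elapsed_time > 15 * 60 * 1000   -- longer than 15 minutes
    ORDER BY total_elapsed_time DESC
"""

conn = snowflake.connector.connect(
    account="my_account", user="svc_monitor", password="...",
)
with conn.cursor() as cur:
    for query_id, user_name, seconds in cur.execute(QUERY):
        # Placeholder: route to the team's alerting channel instead of stdout.
        print(f"ALERT long-running query {query_id} by {user_name}: {seconds:.0f}s")
conn.close()
```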
Posted 1 week ago
15.0 - 20.0 years
10 - 14 Lacs
Gurugram
Work from Office
Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: Data Engineering
Good to have skills: Microsoft SQL Server, Python (Programming Language), Snowflake Data Warehouse
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As a Senior Analyst, Data Engineering, you will be part of the Data and Analytics team, responsible for developing and delivering high-quality data assets and managing data domains for Personal Banking customers and colleagues. You will bring expertise in data handling, curation, and conformity, and support the design and development of data solutions that drive business value. You will work in an agile environment to build scalable and reliable data pipelines and platforms within a complex enterprise.
Roles & Responsibilities:
- Hands-on development experience in Data Warehousing and/or Software Development.
- Utilize tools and best practices to build, verify, and deploy data solutions efficiently.
- Perform data integration and sourcing activities across various platforms.
- Develop data assets to support optimized analysis for customer and regulatory outcomes.
- Provide ongoing support for data platforms, including problem and incident management.
- Collaborate in Agile software development environments using tools like GitHub, Confluence, and Rally.
- Support continuous improvement and innovation in data engineering practices.
Professional & Technical Skills:
- Must Have Skills: Experience with cloud technologies, especially AWS (S3, Redshift, Airflow).
- Proficiency in DevOps and DataOps tools such as Jenkins, Git, and Erwin.
- Advanced skills in SQL and Python.
- Working knowledge of UNIX, Spark, and Databricks.
Additional Information:
- Position: Senior Analyst, Data Engineering
- Reports to: Manager, Data Engineering
- Division: Personal Bank
- Group: 3
- Industry/Domain Skills: Experience in Retail Banking, Business Banking, or Wealth Management preferred
Qualification: 15 years full time education
Posted 1 week ago
15.0 - 20.0 years
10 - 14 Lacs
Bengaluru
Work from Office
Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: Snowflake Data Warehouse
Good to have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure that application requirements are met, overseeing the development process, and providing guidance to team members. You will also engage in problem-solving activities, ensuring that the applications are aligned with business objectives and user needs. Your role will require you to facilitate communication between stakeholders and the development team, ensuring that all parties are informed and engaged throughout the project lifecycle. Additionally, you will monitor project progress and make necessary adjustments to keep the project on track, all while fostering a collaborative and inclusive team environment.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge.
- Facilitate workshops and meetings to gather requirements and feedback from stakeholders.
Professional & Technical Skills:
- Must Have Skills: Proficiency in Snowflake Data Warehouse.
- Good To Have Skills: Experience with data modeling and ETL processes.
- Strong understanding of cloud-based data solutions and architecture.
- Familiarity with SQL and data querying techniques.
- Experience in performance tuning and optimization of data warehouse solutions.
Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Snowflake Data Warehouse.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.
Qualification: 15 years full time education
Posted 1 week ago
15.0 - 20.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Data Modeler
Project Role Description: Work with key business representatives, data owners, end users, application designers and data architects to model current and new data.
Must have skills: Snowflake Data Warehouse
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As a Data Modeler, you will engage with key business representatives, data owners, end users, application designers, and data architects to model both current and new data. Your typical day will involve collaborating with various stakeholders to understand their data needs, analyzing existing data structures, and designing effective data models that support business objectives. You will also be responsible for ensuring that the data models are aligned with best practices and meet the requirements of the organization, facilitating smooth data integration and accessibility for users across the company.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate training sessions for junior team members to enhance their understanding of data modeling.
- Continuously evaluate and improve data modeling processes to increase efficiency and effectiveness.
Professional & Technical Skills:
- Must Have Skills: Proficiency in Snowflake Data Warehouse.
- Strong understanding of data modeling concepts and techniques.
- Experience with ETL processes and data integration.
- Familiarity with data governance and data quality principles.
- Ability to work with various data visualization tools to present data models effectively.
Additional Information:
- The candidate should have a minimum of 5 years of experience in Snowflake Data Warehouse.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.
Qualification: 15 years full time education
Posted 1 week ago
15.0 - 20.0 years
10 - 14 Lacs
Bengaluru
Work from Office
Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: Snowflake Data Warehouse
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure that application requirements are met, overseeing the development process, and providing guidance to team members. You will also engage in problem-solving activities, ensuring that the applications are aligned with business objectives and user needs, while fostering a collaborative environment that encourages innovation and efficiency.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge sharing sessions to enhance team capabilities.
- Monitor project progress and ensure timely delivery of application features.
Professional & Technical Skills:
- Must Have Skills: Proficiency in Snowflake Data Warehouse.
- Strong understanding of data modeling and ETL processes.
- Experience with cloud-based data solutions and architecture.
- Familiarity with SQL and data querying techniques.
- Ability to troubleshoot and optimize data warehouse performance.
Additional Information:
- The candidate should have a minimum of 5 years of experience in Snowflake Data Warehouse.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.
Qualification: 15 years full time education
Posted 1 week ago
7.0 - 12.0 years
4 - 8 Lacs
Gurugram
Work from Office
Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must have skills: Data Modeling Techniques and Methodologies
Good to have skills: Data Engineering, Cloud Data Migration
Minimum 7.5 year(s) of experience is required
Educational Qualification: BE or BTech
Summary: As a Data Engineer, you will be responsible for designing, developing, and maintaining data solutions for data generation, collection, and processing. Your role involves creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across systems.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Develop and maintain data solutions for data generation, collection, and processing.
- Create data pipelines to ensure efficient data flow.
- Implement ETL processes for data migration and deployment.
Professional & Technical Skills:
- Must Have Skills: Proficiency in Data Modeling Techniques and Methodologies.
- Good To Have Skills: Experience with Data Engineering.
- Strong understanding of data modeling techniques and methodologies.
- Experience in cloud data migration.
- Knowledge of data engineering principles.
- Proficient in ETL processes.
Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Data Modeling Techniques and Methodologies.
- This position is based at our Gurugram office.
- A BE or BTech degree is required.
Qualification: BE or BTech
Posted 1 week ago
5.0 - 8.0 years
4 - 7 Lacs
Bengaluru
Work from Office
Skill required: Delivery - Data Wrangling
Designation: I&F Decision Sci Practitioner Sr Analyst
Qualifications: Any Graduation
Years of Experience: 5 to 8 years
About Accenture: Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Technology and Operations services, and Accenture Song, all powered by the world's largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. Visit us at www.accenture.com
What would you do? Design, develop, and adapt data wrangling tools and techniques to explore, transform, map, and discover data patterns, and garner business insights. Data wrangling is the process of transforming and mapping data from one raw form into another format.
What are we looking for? Marketing Data Wrangler:
4+ years designing and implementing large-scale data loading, manipulation, and processing solutions
Experience working in the CPG/Retail/Beauty industries
High proficiency in a data integration package
High proficiency in Snowflake/GCP
Experience in streaming integration development
Cloud development experience (e.g., AWS, Azure)
Strong proficiency in SQL
Able to design and implement relational data models
Good understanding of DevOps and Agile ways of working
Eagerness to contribute to a team-oriented environment
Ability to work creatively and analytically in a problem-solving environment
Excellent leadership, communication (written and oral) and interpersonal skills
Roles and Responsibilities:
Work in interdisciplinary teams that combine technical, business, visualization, and data science competencies
Design and implement solutions around data warehouse implementation, ranging from architecture and ETL processes to multidimensional modelling and data mart implementation
Integrate datasets and dataflows using a variety of best-in-class software, and profile and analyze large and complex datasets from disparate sources
Develop scheduling scripts or configure load schedules
Design and run unit tests
Perform bug diagnosis and fixes
Migrate code between development and test environments
Participate in support of the development environment
Self-starter who can adapt to changes in technology independently
Qualification: Any Graduation
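As a small, hypothetical illustration of the data-wrangling work described, here is a pandas sketch that maps a raw extract into a conformed shape; the file and column names are invented for the example.

```python
# Illustrative sketch only: map a raw point-of-sale extract into a conformed
# shape with pandas. File and column names are hypothetical.
import pandas as pd

raw = pd.read_csv("pos_extract.csv")

conformed = (
    raw.rename(columns={"SKU_CD": "sku", "STORE_NO": "store_id"})
       .assign(
           sale_date=lambda d: pd.to_datetime(d["SALE_DT"], format="%Y%m%d"),
           net_sales=lambda d: d["GROSS_SALES"] - d["RETURNS"],
       )
       .dropna(subset=["sku", "store_id"])            # drop unkeyed rows
       .loc[:, ["sale_date", "store_id", "sku", "net_sales"]]
)

conformed.to_parquet("conformed_sales.parquet", index=False)
```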
Posted 1 week ago
6091 Jobs | Paris,France