Get alerts for new jobs matching your selected skills, preferred locations, and experience range. Manage Job Alerts
5.0 - 9.0 years
19 - 23 Lacs
Mumbai
Work from Office
Overview
MSCI has an immediate opening in one of our fastest-growing product lines. As a Lead Architect within Sustainability and Climate, you are an integral part of a team that works to develop high-quality architecture solutions for various software applications on modern cloud-based technologies. As a core technical contributor, you are responsible for conducting critical architecture solutions across multiple technical areas to support project goals. The systems under your responsibility will be amongst the most mission-critical systems at MSCI. They require strong technology expertise, a strong sense of enterprise system design, state-of-the-art scalability and reliability, and innovation. Your ability to make technology decisions within a consistent framework to support the growth of our company and products, lead the various software implementations in close partnership with global leaders and multiple product organizations, and drive technology innovation will be the key measures of your success in our dynamic and rapidly growing environment. At MSCI, you will be operating in a culture where we value merit and track record. You will own the full life cycle of the technology services and provide management, technical, and people leadership in the design, development, quality assurance, and maintenance of our production systems, making sure we continue to scale our great franchise.

Responsibilities
• Engages technical teams and business stakeholders to discuss and propose technical approaches to meet current and future needs
• Defines the technical target state of the product and drives achievement of the strategy
• As the Lead Architect, you will be responsible for leading the design, development, and maintenance of our data architecture, ensuring scalability, efficiency, and reliability
• Create and maintain comprehensive documentation for the architecture, processes, and best practices, including Architecture Decision Records (ADRs)
• Evaluates recommendations and provides feedback on new technologies
• Develops secure and high-quality production code, and reviews and debugs code written by others
• Identifies opportunities to eliminate or automate remediation of recurring issues to improve the overall operational stability of software applications and systems
• Collaborates with a cross-functional team to draft, implement, and adapt the overall architecture of our products and support infrastructure in conjunction with software development managers and product management teams
• Stays abreast of new technologies and issues in the software-as-a-service industry, including current technologies, platforms, standards, and methodologies
• Is actively engaged in setting technology standards that impact the company and its offerings
• Ensures the knowledge sharing of engineering best practices across departments, and develops and monitors technical standards to ensure adherence to them

Qualifications
• Prior senior software architecture roles
• Demonstrated proficiency in programming languages such as Python/Java/Scala and knowledge of SQL and NoSQL databases
• Drive the development of conceptual, logical, and physical data models aligned with business requirements
• Lead the implementation and optimization of data technologies, including Apache Spark
• Experience with one of the table formats, such as Delta or Iceberg
• Strong hands-on experience in data architecture, database design, and data modeling
• Proven experience as a Data Platform Architect or in a similar role, with expertise in Airflow, Databricks, Snowflake, Collibra, and Dremio
• Experience with cloud platforms such as AWS, Azure, or Google Cloud
• Ability to dive into details; a hands-on technologist with strong core computer science fundamentals
• Strong preference for financial services experience
• Proven leadership of large-scale distributed software teams that have delivered great products on deadline
• Experience in a modern iterative software development methodology
• Experience with globally distributed teams and business partners
• Experience in building and maintaining applications that are mission-critical for customers
• M.S. in Computer Science, Management Information Systems, or a related engineering field
• 15+ years of software engineering experience
• Demonstrated consensus builder and collegial peer

What we offer you
Transparent compensation schemes and comprehensive employee benefits, tailored to your location, ensuring your financial security, health, and overall wellbeing. Flexible working arrangements, advanced technology, and collaborative workspaces. A culture of high performance and innovation where we experiment with new ideas and take responsibility for achieving results. A global network of talented colleagues who inspire, support, and share their expertise to innovate and deliver for our clients. A Global Orientation program to kickstart your journey, followed by access to our Learning@MSCI platform, LinkedIn Learning Pro, and tailored learning opportunities for ongoing skills development. Multi-directional career paths that offer professional growth and development through new challenges, internal mobility, and expanded roles. We actively nurture an environment that builds a sense of inclusion, belonging, and connection, including eight Employee Resource Groups: All Abilities, Asian Support Network, Black Leadership Network, Climate Action Network, Hola! MSCI, Pride & Allies, Women in Tech, and Women's Leadership Forum.

At MSCI we are passionate about what we do, and we are inspired by our purpose – to power better investment decisions. You'll be part of an industry-leading network of creative, curious, and entrepreneurial pioneers. This is a space where you can challenge yourself, set new standards, and perform beyond expectations for yourself, our clients, and our industry. MSCI is a leading provider of critical decision support tools and services for the global investment community. With over 50 years of expertise in research, data, and technology, we power better investment decisions by enabling clients to understand and analyze key drivers of risk and return and confidently build more effective portfolios. We create industry-leading research-enhanced solutions that clients use to gain insight into and improve transparency across the investment process. MSCI Inc. is an equal opportunity employer. It is the policy of the firm to ensure equal employment opportunity without discrimination or harassment on the basis of race, color, religion, creed, age, sex, gender, gender identity, sexual orientation, national origin, citizenship, disability, marital and civil partnership/union status, pregnancy (including unlawful discrimination on the basis of a legally protected parental leave), veteran status, or any other characteristic protected by law. MSCI is also committed to working with and providing reasonable accommodations to individuals with disabilities.
If you are an individual with a disability and would like to request a reasonable accommodation for any part of the application process, please email Disability.Assistance@msci.com and indicate the specifics of the assistance needed. Please note, this e-mail is intended only for individuals who are requesting a reasonable workplace accommodation; it is not intended for other inquiries.

To all recruitment agencies: MSCI does not accept unsolicited CVs/resumes. Please do not forward CVs/resumes to any MSCI employee, location, or website. MSCI is not responsible for any fees related to unsolicited CVs/resumes.

Note on recruitment scams: We are aware of recruitment scams where fraudsters impersonating MSCI personnel may try to elicit personal information from job seekers. Read our full note on careers.msci.com.
Posted 1 month ago
5.0 - 10.0 years
12 - 22 Lacs
Bengaluru
Work from Office
Detailed JD (Roles and Responsibilities)
• Proficiency in Snowflake and Unix scripting
• Experience working on data warehouse projects involving Snowflake; exposure to ETL and data processing (a brief Python-plus-Snowflake sketch follows this listing)
• Strong analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy
• Good exposure to Python scripting
• Good exposure to SQL concepts
• Good understanding of data warehouse concepts

Mandatory skills: Snowflake, SQL, Unix
Desired/secondary skills: Python
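For illustration, a minimal sketch of the Snowflake-from-Python work described above, using the snowflake-connector-python package; the account, credentials, table, and column names are placeholders, not details from this posting.

```python
# Minimal sketch: run a load-validation query against Snowflake from Python.
# All connection parameters and table names below are illustrative placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",      # placeholder account identifier
    user="etl_user",
    password="***",
    warehouse="ANALYTICS_WH",
    database="DW",
    schema="PUBLIC",
)
try:
    cur = conn.cursor()
    # Check that yesterday's load landed before downstream jobs run.
    cur.execute(
        "SELECT COUNT(*) FROM sales_fact "
        "WHERE load_date = DATEADD(day, -1, CURRENT_DATE())"
    )
    (row_count,) = cur.fetchone()
    print(f"Rows loaded yesterday: {row_count}")
finally:
    conn.close()
```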
Posted 1 month ago
4.0 - 9.0 years
0 - 3 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
Hi, greetings from Diverse Lynx!

Skill: Snowflake Developer

Please share the below details: Total Experience, Relevant Experience, Notice Period, PAN Number, Graduation Percentage, DOB, CTC, ECTC

JD – Desired Competencies (Technical/Behavioral Competency)
Must-Have:
1. In-depth knowledge of Snowflake, SQL, DBT (Data Build Tool), and DataStage
2. Knowledge of the ETL process to facilitate smooth transfers and storage of data
3. Scripting skills using Python

Good-to-Have:
1. Exposure to job scheduling/monitoring environments (e.g., Control-M)
2. Any big data tools: Hadoop, Spark
3. Cloud migration experience

Responsibility of / Expectations from the Role:
Implementing ETL pipelines within and outside of a data warehouse using Python and Snowflake's SnowSQL. Work with the end user to get data requirements, performing complex data mapping and developing/maintaining ETL programs to support the enterprise data warehouse. Develop SQL scripts to load data from source systems and verify that data in target tables loads correctly through the ELT process. Development of scripts using Unix, Python, etc. for loading, extracting, and transforming data (a brief scripting sketch follows this listing). Assist with production issues in data warehouses, such as reloading data, transformations, and translations. Develop a database design and reporting design based on business intelligence and reporting requirements.

Interested candidates, share your resume on the mail ID (Vidushi.sharma@diverselynx.in)

Thanks & Regards,
Vidushi Sharma
Technical Recruiter
Diverse Lynx India Pvt Ltd.
M: 8923113032 | T: 0120-4604500 | Extn. 1169
Email: Vidushi.sharma@diverselynx.in
USA - New Jersey | Canada - Ottawa | UK - London | India - Noida
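As an illustration of the Python scripting work this posting describes, below is a minimal pandas sketch of a pre-load cleanup step; the file paths and column names are assumptions for the example only.

```python
# Minimal sketch: a pre-load transform step in a Python ETL script.
# File paths and column names are illustrative placeholders.
import pandas as pd

df = pd.read_csv("/data/incoming/orders_raw.csv")

# Typical light-touch transforms before staging into the warehouse:
df.columns = [c.strip().lower() for c in df.columns]           # normalize headers
df["order_date"] = pd.to_datetime(df["order_date"], errors="coerce")
df = df.dropna(subset=["order_id", "order_date"])              # drop unusable rows
df["amount"] = df["amount"].fillna(0).round(2)

df.to_csv("/data/staging/orders_clean.csv", index=False)
print(f"Staged {len(df)} rows for COPY INTO")
```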
Posted 1 month ago
8.0 - 13.0 years
25 - 40 Lacs
Pune
Work from Office
What You'll Do
Job Description: You will provide 24/7 administrative support (on-prem and Atlas Cloud) on MongoDB clusters, Postgres, and Snowflake, and provide support for on-prem and Confluent Cloud Kafka clusters. You will review database designs to ensure all technical and business requirements are met, and perform database optimization and testing to ensure service level agreements are met. You will provide support during system implementation and in production, and support Snowflake administrative tasks (data pipelines, object creation, access). You will participate in weekday and weekend on-call rotations to support products running on Mongo, SQL, Kafka, Snowflake, and other RDBMS systems. This role does not have any managerial responsibilities; it is an individual contributor role. You will report to the Sr. Manager, Reliability Engineering.

What Your Responsibilities Will Be
• 8+ years of experience in managing MongoDB on-prem and on Atlas Cloud
• Be a part of the database team in developing next-generation database systems
• Provide services in administration and performance monitoring of database-related systems
• Develop system administration standards and procedures to maintain practices
• Support backup and recovery strategies
• Participate in the creative process of improving architectural designs and implementing new architectures
• Expertise in delivering efficiency and cost effectiveness
• Monitor and support capacity planning and analysis
• Monitor performance, troubleshoot issues, and proactively tune databases and workloads
• Sound knowledge of Terraform and Grafana; manage infrastructure as code using Terraform and GitLab
• Ability to work remotely

What You'll Need to be Successful
• Working knowledge of MongoDB (6.0 or above); experience with sharding and replica sets
• Working knowledge of database installation, setup, creation, and maintenance processes
• Working knowledge of Change Streams and Mongo ETLs to replicate live changes to downstream analytics systems (see the sketch after this listing)
• Experience running MongoDB in containerized environments (EKS clusters)
• Support reliability engineering tasks for all other database platforms (SQL, MySQL, Postgres, Snowflake, Kafka)
• Experience with Cloud or Ops Manager (a plus)
• Understanding of networking components on AWS and GCP
• Technical knowledge of backup/recovery, disaster recovery, and high availability techniques
• Strong technical knowledge of writing shell scripts used to support database administration
• Good understanding of Kafka and Snowflake administration; a good understanding of Debezium and Zookeeper is a plus
• Automate routine database tasks independently with shell, Python, and other languages
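The Change Streams requirement above can be pictured with a short pymongo sketch that tails a collection and hands events to a downstream system; the connection URI, database, and collection names are placeholders.

```python
# Minimal sketch: tailing a MongoDB change stream to feed a downstream
# analytics system. Connection string, database, and collection names
# are illustrative placeholders.
from pymongo import MongoClient

client = MongoClient("mongodb+srv://cluster0.example.mongodb.net")  # placeholder URI
orders = client["shop"]["orders"]

# A real replicator would persist resume tokens so it can restart without
# losing events; omitted here for brevity.
with orders.watch(full_document="updateLookup") as stream:
    for change in stream:
        op = change["operationType"]          # insert / update / delete / ...
        doc = change.get("fullDocument")
        # This is where the event would be shipped to Kafka, Snowflake,
        # or another downstream store.
        print(op, doc["_id"] if doc else change["documentKey"]["_id"])
```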
Posted 1 month ago
10.0 - 15.0 years
30 - 45 Lacs
Pune
Work from Office
What You'll Do
The Global Analytics & Insights (GAI) team is seeking a Data & Analytics Engineering Manager to lead our team in designing, developing, and maintaining data pipelines and analytics infrastructure. As a Data & Analytics Engineering Manager, you will play a pivotal role in empowering a team of engineers to build and enhance analytics applications and a modern data platform using Snowflake, dbt (Data Build Tool), Python, Terraform, and Airflow (a minimal orchestration sketch follows this listing). You will become an expert in Avalara's financial, marketing, sales, and operations data. The ideal candidate will have deep SQL experience, an understanding of modern data stacks and technology, demonstrated leadership and mentoring experience, and an ability to drive innovation and manage complex projects. This position will report to a Senior Manager.

What Your Responsibilities Will Be
Mentor a team of data engineers, providing guidance and support to ensure a high level of quality and career growth. Lead a team of data engineers in the development and maintenance of data pipelines, data modelling, code reviews, and data products. Collaborate with cross-functional teams to understand requirements and translate them into scalable data solutions. Drive innovation and continuous improvement within the data engineering team. Build maintainable and scalable processes and playbooks to ensure consistent delivery and quality across projects. Drive adoption of best practices in data engineering and data modelling. Be the visible lead of the team: coordinate communication, releases, and status to various stakeholders.

What You'll Need to be Successful
Bachelor's degree in Computer Science, Engineering, or a related field. 10+ years of experience in the data engineering field, with deep SQL knowledge. 2+ years of management experience, including direct technical reports. 5+ years of experience with data warehousing concepts and technologies. 4+ years of working with Git, with demonstrated experience using these tools to facilitate the growth of engineers. 4+ years working with Snowflake. 3+ years working with dbt (dbt Core preferred).

Preferred Qualifications: Snowflake, dbt, or AWS certified. 3+ years working with Infrastructure as Code, preferably Terraform. 2+ years working with CI/CD, with a demonstrated ability to build and operate pipelines. Experience and understanding of Snowflake administration and security principles. Demonstrated experience with Airflow.
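A minimal sketch of the Airflow-plus-dbt orchestration named in this posting, assuming Airflow 2.x; the schedule, project path, and target are placeholders, and a real deployment would manage credentials and retries explicitly.

```python
# Minimal sketch of an Airflow DAG orchestrating a daily dbt build.
# Schedule, project path, and target are illustrative placeholders.
from datetime import datetime
from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="dbt_daily_build",
    start_date=datetime(2024, 1, 1),
    schedule="0 6 * * *",   # daily at 06:00 (placeholder; Airflow 2.4+ syntax)
    catchup=False,
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="cd /opt/dbt/analytics && dbt run --target prod",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="cd /opt/dbt/analytics && dbt test --target prod",
    )
    dbt_run >> dbt_test   # only test once the models have built
```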
Posted 1 month ago
3.0 - 8.0 years
8 - 16 Lacs
Bengaluru
Work from Office
Job Description: Business System Analyst (3+ yrs)
Location: Bangalore (Hybrid Mode)
Primary Skills: SQL, Snowflake, SDLC, US healthcare domain, strong communication & documentation

QUALIFICATION: Bachelor's or Master's degree

JOB RESPONSIBILITY:
Work closely with business stakeholders to understand their needs, objectives, and challenges. Elicit, document, and analyze business requirements, processes, and workflows. Translate business requirements into clear and concise functional specifications for technical teams. Collaborate with technology teams to design solutions that meet business needs. Propose innovative and practical solutions to address business challenges. Ensure that proposed solutions align with the organization's strategic goals and technological capabilities. Identify areas for process optimization and efficiency enhancement. Recommend process improvements and assist in their implementation. Must have very good knowledge of the healthcare domain and SQL. AWS and Snowflake technologies are good to have. Hands-on experience with complex SQL queries (Snowflake). Knowledge of database management systems, both relational and non-relational. Familiarity with data integration and ETL tools.

Sri Lalitha
92810 37167
sri.lalitha@spsoftglobal.com
Posted 1 month ago
3.0 - 4.0 years
12 - 15 Lacs
Hyderabad
Work from Office
We are seeking a versatile and passionate Full Stack Developer to contribute across our entire application stack. You will be responsible for both front-end user interfaces and back-end services, ensuring seamless integration, optimal performance, and an exceptional user experience. This role requires a strong understanding of modern web technologies and the ability to adapt to diverse technical challenges.

Required Skills
• 3+ years of experience developing full-stack web applications
• Frontend expertise: proficiency in React.js for building interactive and responsive user interfaces; strong command of TypeScript and JavaScript, HTML5, and CSS3
• Backend expertise: solid experience with Python, particularly FastAPI, and Django (including Django REST Framework - DRF)
• Practical experience with MySQL and other relational databases
• Experience with Kafka for building event-driven systems
• Experience deploying and managing applications on cloud services
• Demonstrated ability to design and implement RESTful APIs and work with GraphQL
• Proficiency in version control systems (e.g., Git)
• Proven track record in coding, testing, and process adherence
• Exceptional problem-solving capabilities and analytical skills, with a keen eye for both front-end usability and back-end scalability
• Excellent verbal and written communication skills in English for effective cross-functional collaboration

Preferred Qualifications:
• Familiarity with WebSockets for real-time features
• Understanding of containerization technologies like Docker and orchestration with Kubernetes
• Exposure to Generative AI libraries and other AI/ML Python libraries for integrating intelligent features across the stack
• Experience with NoSQL databases like ArangoDB or vector databases
• Knowledge of Redis for caching
• Familiarity with Snowflake or Spark for data processing and analysis
• Experience with Meltano for data integration pipelines
• Exposure to HAProxy for load balancing
• Proficiency in other programming languages in our stack (e.g., PHP)

Personal Attributes:
• Self-motivated and proactive with a detail-oriented approach to software development as an individual contributor
• Strong interpersonal skills and the ability to engage effectively with various stakeholders
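To ground the FastAPI requirement above, here is a minimal, self-contained endpoint sketch; the Order model, routes, and the in-memory store standing in for MySQL are all illustrative assumptions.

```python
# Minimal sketch of a FastAPI service of the kind this role builds.
# Model fields and routes are illustrative placeholders; run with
# `uvicorn app:app` after saving as app.py.
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI()

class Order(BaseModel):
    order_id: int
    amount: float

# In-memory stand-in for MySQL, to keep the sketch self-contained.
ORDERS: dict[int, Order] = {}

@app.post("/orders", status_code=201)
def create_order(order: Order) -> Order:
    ORDERS[order.order_id] = order
    return order

@app.get("/orders/{order_id}")
def get_order(order_id: int) -> Order:
    if order_id not in ORDERS:
        raise HTTPException(status_code=404, detail="order not found")
    return ORDERS[order_id]
```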
Posted 1 month ago
7.0 - 12.0 years
15 - 22 Lacs
Hyderabad, Pune
Work from Office
Role & responsibilities: Outline the day-to-day responsibilities for this role. Preferred candidate profile: Specify required role expertise, previous job experience, or relevant certifications. Perks and benefits: Mention available facilities and benefits the company is offering with this job.
Posted 1 month ago
6.0 - 10.0 years
10 - 15 Lacs
Kolkata, Pune, Bengaluru
Work from Office
Practice Overview:
Skill/Operating Group: Technology Consulting
Level: Consultant
Location: Gurgaon/Mumbai/Bangalore/Kolkata/Pune
Travel Percentage: Expected travel could be anywhere between 0-100%

Principal Duties and Responsibilities:
Working closely with our clients, Consulting professionals design, build, and implement strategies that can help enhance business performance. They develop specialized expertise (strategic, industry, functional, technical) in a diverse project environment that offers multiple opportunities for career growth. The opportunities to make a difference within exciting client initiatives are limitless in this ever-changing business landscape. Here are just a few of your day-to-day responsibilities:
• Architect large-scale data lake, DW, and Delta Lake on cloud solutions using AWS, Azure, GCP, Ali Cloud, Snowflake, Hadoop, or Cloudera
• Design Data Mesh strategy and architecture
• Build strategy and roadmap for data migration to cloud
• Establish Data Governance strategy and operating model
• Implement programs/interventions that prepare the organization for implementation of new business processes
• Apply a deep understanding of data and analytics platforms and data integration with cloud
• Provide thought leadership to downstream teams for developing offerings and assets
• Identify, assess, and solve complex business problems for your area of responsibility, where analysis of situations or data requires an in-depth evaluation of variable factors
• Oversee the production and implementation of solutions covering multiple cloud technologies, associated infrastructure/application architecture, development, and operating models
• Apply your solid understanding of data, data on cloud, and disruptive technologies
• Drive enterprise business, application, and integration architecture
• Help solve key business problems and challenges by enabling a cloud-based architecture transformation, painting a picture of, and charting a journey from, the current state to a to-be enterprise environment
• Assist our clients to build the required capabilities for growth and innovation to sustain high performance
• Manage multi-disciplinary teams to shape, sell, communicate, and implement programs
• Experience in participating in client presentations and orals for proposal defense, etc.
• Experience in effectively communicating the target state, architecture, and topology on cloud to clients

Qualifications:
• Bachelor's degree
• MBA degree from a Tier-1 college (preferable)
• 6-10 years of large-scale consulting experience and/or working with hi-tech companies in data architecture, data governance, data mesh, data security, and management
• Certified in DAMA (Data Management), Azure Data Architecture, Google Cloud Data Analytics, or AWS Data Analytics

Experience: We are looking for experienced professionals with data strategy, data architecture, data on cloud, data modernization, data operating model, and data security experience across all stages of the innovation spectrum, with a remit to build the future in real time. The candidate should have practical industry expertise in one of these areas: Financial Services, Retail, Consumer Goods, Telecommunications, Life Sciences, Transportation, Hospitality, Automotive/Industrial, or Mining and Resources.
Key Competencies and Skills:
The right candidate should have competency and skills aligned to one or more of these archetypes:
• Data SME: experience in deal shaping and strong presentation skills, leading proposal experience, customer orals; technical understanding of data platforms, data on cloud strategy, data strategy, data operating model, change management of data transformation programs, and data modeling skills
• Data on Cloud Architect: technical understanding of data platform strategy for data-on-cloud migrations and big data technologies; experience in architecting large-scale data lake and DW on cloud solutions; experience with one or more technologies in this space: AWS, Azure, GCP, AliCloud, Snowflake, Hadoop, Cloudera
• Data Strategy: data capability maturity assessment, data & analytics/AI strategy, data operating model and governance, data hub enablement, data on cloud strategy, data architecture strategy
• Data Transformation Lead: understanding of the data supply chain and data platforms on cloud; experience in conducting alignment workshops, building value realization frameworks for data transformations, and program management

Exceptional interpersonal and presentation skills, with the ability to convey technology and business value propositions to senior stakeholders. Capacity to develop high-impact thought leadership that articulates a forward-thinking view of the market.

Other desired skills:
• Strong desire to work in technology-driven business transformation
• Strong knowledge of technology trends across IT and digital and how they can be applied to companies to address real-world problems and opportunities
• Comfort conveying both high-level and detailed information, adjusting the way ideas are presented to better address varying social styles and audiences
• Leading proof-of-concept and/or pilot implementations and defining the plan to scale implementations across multiple technology domains
• Flexibility to accommodate client travel requirements
• Published thought leadership: whitepapers, POVs
Posted 1 month ago
4.0 - 8.0 years
5 - 9 Lacs
Chennai
Work from Office
Skill: Talend with SQL and DBT, plus Snowflake knowledge
• Talend: designing, developing, and documenting existing Talend ETL processes, technical architecture, data pipelines, and performance scaling, using tools to integrate Talend data and ensure data quality in a big data environment
• Snowflake SQL: writing SQL queries against Snowflake; developing scripts (Unix, Python, etc.) to extract, load, and transform data (a brief loading sketch follows this listing)
• Strong on DBT with Snowflake SQL
• Perform data analysis, troubleshoot data issues, and provide technical support to end users
• Develop and maintain data warehouse and ETL processes, ensuring data quality and integrity
• Complex problem-solving capability and a continuous-improvement approach
• Talend/Snowflake certification is desirable
• Excellent SQL coding skills
• Excellent communication and documentation skills
• Familiar with the Agile delivery process
• Must be analytical, creative, and self-motivated
• Work effectively within a global team environment

Experience: 4+ years
Location: Chennai
Notice period: immediate to 30 days max
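The extract-load scripting mentioned above might look like the following minimal sketch, which stages a local file into Snowflake with PUT and loads it with COPY INTO; connection details, stage, and table names are placeholders.

```python
# Minimal sketch: loading a staged file into Snowflake from Python.
# Paths, stage, and table names are illustrative placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",
    warehouse="LOAD_WH", database="DW", schema="STAGING",
)
cur = conn.cursor()
try:
    # Upload the local file to the table's internal stage...
    cur.execute("PUT file:///data/staging/orders_clean.csv @%orders_stage")
    # ...then load it, failing fast on bad rows so issues surface early.
    cur.execute(
        "COPY INTO orders_stage "
        "FROM @%orders_stage "
        "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1) "
        "ON_ERROR = 'ABORT_STATEMENT'"
    )
    print(cur.fetchall())   # COPY INTO returns per-file load results
finally:
    cur.close()
    conn.close()
```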
Posted 1 month ago
5.0 - 8.0 years
15 - 22 Lacs
Noida, Bengaluru, Delhi / NCR
Hybrid
Hi candidates, we have an opportunity with one of the leading IT consulting groups for a data engineer role. Interested candidates can mail their CVs to Abhishek.saxena@mounttalent.com

Job Description
What we're looking for: Data Engineer III with:
• 5+ years of experience with ETL processes and data warehouse architecture
• 5+ years of experience with Azure data services, i.e. ADF, ADLS Gen 2, Azure SQL DB, Synapse, Azure Databricks, Microsoft Fabric
• 5+ years of experience designing business intelligence solutions
• Strong proficiency in SQL and Python/PySpark
• Implementation experience with the medallion architecture and Delta Lake (or lakehouse)
• Experience with cloud-based data platforms, preferably Azure
• Familiarity with big data technologies and data warehousing concepts
• Working knowledge of Azure DevOps and CI/CD (build and release)
Posted 1 month ago
8.0 - 13.0 years
25 - 37 Lacs
Hyderabad
Work from Office
SQL & Database Management: Deep knowledge of relational databases (PostgreSQL), cloud-hosted data platforms (AWS, Azure, GCP), and data warehouses like Snowflake.
ETL/ELT Tools: Experience with SnapLogic, StreamSets, or DBT for building and maintaining data pipelines; extensive experience with data pipelines in ETL tools generally.
Data Modeling & Optimization: Strong understanding of data modeling, OLAP systems, query optimization, and performance tuning (see the sketch after this listing).
Cloud & Security: Familiarity with cloud platforms and SQL security techniques (e.g., data encryption, TDE).
Data Warehousing: Experience managing large datasets and data marts, and optimizing databases for performance.
Agile & CI/CD: Knowledge of Agile methodologies and CI/CD automation tools.
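A small sketch of the query-optimization workflow referenced above, using psycopg2 to inspect a PostgreSQL query plan; the DSN and the query are illustrative assumptions.

```python
# Minimal sketch: checking a slow query's plan in PostgreSQL from Python,
# a routine step in performance-tuning work. Connection details and the
# query are illustrative placeholders.
import psycopg2

conn = psycopg2.connect("dbname=appdb user=dba host=localhost")  # placeholder DSN
cur = conn.cursor()

cur.execute("""
    EXPLAIN (ANALYZE, BUFFERS)
    SELECT customer_id, SUM(amount)
    FROM orders
    WHERE order_date >= CURRENT_DATE - INTERVAL '30 days'
    GROUP BY customer_id
""")
for (line,) in cur.fetchall():
    print(line)   # look for Seq Scans on large tables -> candidate indexes

cur.close()
conn.close()
```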
Posted 1 month ago
4.0 - 8.0 years
6 - 10 Lacs
Pune, Gurugram
Work from Office
ZS is a place where passion changes lives. As a management consulting and technology firm focused on improving life and how we live it, our most valuable asset is our people. Here you'll work side-by-side with a powerful collective of thinkers and experts shaping life-changing solutions for patients, caregivers, and consumers worldwide. ZSers drive impact by bringing a client-first mentality to each and every engagement. We partner collaboratively with our clients to develop custom solutions and technology products that create value and deliver company results across critical areas of their business. Bring your curiosity for learning, bold ideas, courage, and passion to drive life-changing impact to ZS.

Our most valuable asset is our people. At ZS we honor the visible and invisible elements of our identities, personal experiences, and belief systems, the ones that comprise us as individuals, shape who we are, and make us unique. We believe your personal interests, identities, and desire to learn are part of your success here. Learn more about our diversity, equity, and inclusion efforts and the networks ZS supports to assist our ZSers in cultivating community spaces, obtaining the resources they need to thrive, and sharing the messages they are passionate about.

Business Technology
ZS's Technology group focuses on scalable strategies, assets, and accelerators that deliver to our clients enterprise-wide transformation via cutting-edge technology. We leverage digital and technology solutions to optimize business processes, enhance decision-making, and drive innovation. Our services include, but are not limited to, digital and technology advisory, product and platform development, and data, analytics, and AI implementation.

What you'll do
• Undertake complete ownership in accomplishing activities and assigned responsibilities across all phases of the project lifecycle to solve business problems across one or more client engagements
• Apply appropriate development methodologies (e.g., agile, waterfall) and best practices (e.g., mid-development client reviews, embedded QA procedures, unit testing) to ensure successful and timely completion of assignments
• Collaborate with other team members to leverage expertise and ensure seamless transitions
• Exhibit flexibility in undertaking new and challenging problems and demonstrate excellent task management
• Assist in creating project outputs such as business case development, solution vision and design, user requirements, prototypes, technical architecture (if needed), test cases, and operations management
• Bring transparency in driving assigned tasks to completion and report accurate status
• Bring a consulting mindset to problem solving and innovation by leveraging technical and business knowledge/expertise, and collaborate across other teams
• Assist senior team members and delivery leads in project management responsibilities

What you'll bring
• Big Data Technologies: Proficiency in working with big data technologies, particularly in the context of Azure Databricks, which may include Apache Spark for distributed data processing
• Azure Databricks: In-depth knowledge of Azure Databricks for data engineering tasks, including data transformations, ETL processes, and job scheduling
• SQL and Query Optimization: Strong SQL skills for data manipulation and retrieval, along with the ability to optimize queries for performance in Snowflake
• ETL (Extract, Transform, Load): Expertise in designing and implementing ETL processes to move and transform data between systems, utilizing tools and frameworks available in Azure Databricks (a brief PySpark sketch follows this listing)
• Data Integration: Experience with integrating diverse data sources into a cohesive and usable format, ensuring data quality and integrity
• Python/PySpark: Knowledge of programming languages like Python and PySpark for scripting and extending the functionality of Azure Databricks notebooks
• Version Control: Familiarity with version control systems, such as Git, for managing code and configurations in a collaborative environment
• Monitoring and Optimization: Ability to monitor data pipelines, identify bottlenecks, and optimize performance for Azure Data Factory
• Security and Compliance: Understanding of security best practices and compliance considerations when working with sensitive data in Azure and Snowflake environments
• Snowflake Data Warehouse: Experience in designing, implementing, and optimizing data warehouses using Snowflake, including schema design, performance tuning, and query optimization
• Healthcare Domain Knowledge: Familiarity with US health plan terminologies and datasets is essential
• Programming/Scripting Languages: Proficiency in Python, SQL, and PySpark is required
• Cloud Platforms: Experience with AWS or Azure, specifically in building data pipelines, is needed
• Cloud-Based Data Platforms: Working knowledge of Snowflake and Databricks is preferred
• Data Pipeline Orchestration: Experience with Azure Data Factory and AWS Glue for orchestrating data pipelines is necessary
• Relational Databases: Competency with relational databases such as PostgreSQL and MySQL is required, while experience with NoSQL databases is a plus
• BI Tools: Knowledge of BI tools such as Tableau and PowerBI is expected
• Version Control: Proficiency with Git, including branching, merging, and pull requests, is required
• CI/CD for Data Pipelines: Experience in implementing continuous integration and delivery for data workflows using tools like Azure DevOps is essential

Additional Skills
• Experience with front-end technologies such as SQL, JavaScript, HTML, CSS, and Angular is advantageous
• Familiarity with web development frameworks like Flask, Django, and FastAPI is beneficial
• Basic knowledge of AWS CI/CD practices is a plus
• Strong verbal and written communication skills, with the ability to articulate results and issues to internal and client teams
• Proven ability to work creatively and analytically in a problem-solving environment
• Willingness to travel to other global offices as needed to work with client or other internal project teams

Perks & Benefits
ZS offers a comprehensive total rewards package including health and well-being, financial planning, annual leave, personal growth, and professional development. Our robust skills development programs, multiple career progression options, internal mobility paths, and collaborative culture empower you to thrive as an individual and global team member. We are committed to giving our employees a flexible and connected way of working. A flexible and connected ZS allows us to combine work from home and on-site presence at clients/ZS offices for the majority of our week. The magic of ZS culture and innovation thrives in both planned and spontaneous face-to-face connections.

Travel
Travel is a requirement at ZS for client-facing ZSers; the business needs of your project and client are the priority.
While some projects may be local, all client-facing ZSers should be prepared to travel as needed. Travel provides opportunities to strengthen client relationships, gain diverse experiences, and enhance professional growth by working in different environments and cultures.

Considering applying?
At ZS, we're building a diverse and inclusive company where people bring their passions to inspire life-changing impact and deliver better outcomes for all. We are most interested in finding the best candidate for the job and recognize the value that candidates with all backgrounds, including non-traditional ones, bring. If you are interested in joining us, we encourage you to apply even if you don't meet 100% of the requirements listed above.

To Complete Your Application
Candidates must possess or be able to obtain work authorization for their intended country of employment. An online application, including a full set of transcripts (official or unofficial), is required to be considered.
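The Azure Databricks ETL skills listed in this posting can be pictured with the minimal PySpark sketch below, which reads raw JSON, applies light transforms, and writes a curated Delta table; paths and column names are placeholders, and the Delta output assumes a Delta-enabled cluster.

```python
# Minimal sketch of a Databricks-style PySpark ETL step: read raw data,
# apply transforms, and write a curated Delta table. Paths and column
# names are illustrative placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("claims_etl").getOrCreate()

raw = spark.read.json("/mnt/raw/claims/")          # landing zone (placeholder)

curated = (
    raw.filter(F.col("claim_status").isNotNull())
       .withColumn("claim_date", F.to_date("claim_date"))
       .withColumn("paid_amount", F.col("paid_amount").cast("double"))
       .dropDuplicates(["claim_id"])
)

# Delta format assumes a Databricks / delta-enabled environment.
(curated.write
    .format("delta")
    .mode("overwrite")
    .partitionBy("claim_date")
    .save("/mnt/curated/claims/"))
```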
Posted 1 month ago
9.0 - 14.0 years
11 - 16 Lacs
Hyderabad
Work from Office
Overview
We are looking for a strategic and hands-on Architect specializing in Real-Time Decisioning (RTD) to lead the design and implementation of intelligent, data-driven customer engagement solutions. With over 9 years of experience, the ideal candidate will bring deep technical expertise in real-time decision-making platforms and marketing technologies to drive personalization, automation, and optimized customer experiences across digital channels. The main purpose of the role is to provide architectural design governance and technical leadership, and to develop and deliver customized solutions within the Real Time Decisioning (RTD) platforms to support critical business functions and meet project goals.

Responsibilities
• Design and implement Real-Time Decisioning initiatives to support next-best-action strategies across web, mobile, email, and contact center touchpoints
• Translate business goals into decisioning logic, data models, and integration strategies
• Work with the RTD Business Analyst, Sector Product Owner, Sector IT, and Marketing teams to transform new requirements into best-practice-led technical design
• Design and implement real-time personalization use cases using Salesforce Marketing Cloud Personalization (MCP) or Interaction Studio and CleverTap capabilities (triggers, campaigns, decisions, and Einstein)
• Work directly with stakeholders to design and govern highly usable, scalable, extensible, and maintainable Salesforce solutions
• Define and implement data integrations between Salesforce Marketing Cloud, CDP, CRM, and other platforms
• Deal with ambiguous problems, take responsibility for finding solutions, and drive towards simple solutions to complex problems
• Troubleshoot key implementation issues and demonstrate the ability to drive to a successful resolution
• Use deep business knowledge of RTD to assist with estimation for major new initiatives
• Provide oversight to the development team (up to 5 resources) and ensure sound technical delivery of the product
• Design and implement complex solutions in the personalization of mobile apps, websites, etc.
• Implement integrations with other systems using SDKs and APIs
• Contribute to RTD CoE building activities by creating reference architectures, common patterns, data models, and re-usable assets that empower our stakeholders to maximize business value using the breadth of Salesforce solutions available, harvesting knowledge from existing implementations
• Evangelize and educate internal stakeholders about RTD technologies

Qualifications
• Bachelor's degree in IT, Computer Science, or equivalent
• 9-14 plus years of IT experience, with at least 5+ years of experience with real-time decisioning and personalization tools like Marketing Cloud Personalization (MCP) or Interaction Studio, CleverTap, etc.
• Strong understanding of customer data models, behavioural analytics, segmentation, and machine learning models
• Experience with APIs, real-time event processing, and data pipelines
• Familiarity with cloud environments (AWS, Azure, GCP) and data platforms (e.g., Snowflake, BigQuery)
• Hands-on experience with rules engines, decision tables, and AI-based recommendations
• Excellent problem-solving, communication, and stakeholder management skills
• Experience developing customer-facing user interfaces with Lightning Components
• Agile delivery experience; self-motivated and creative; good communication and interpersonal skills
• Experience providing technical governance on architectural design and leading a development team in a technical capacity
• A motivated self-starter, able to change direction quickly when priorities shift and to think through problems quickly to design and deliver solutions
• Passion for technology and for learning
Posted 1 month ago
3.0 - 6.0 years
15 - 25 Lacs
Chennai, Bengaluru
Hybrid
Job Description: We are seeking a highly experienced and skilled Senior Data Engineer to join our dynamic team. This role requires hands-on experience with databases such as Snowflake and Teradata, as well as advanced knowledge in various data science and AI techniques. The successful candidate will play a pivotal role in driving data-driven decision-making and innovation within our organization. Roles and Responsibilities: Design, develop, and implement advanced machine learning models to solve complex business problems. Apply AI techniques and generative AI models to enhance data analysis and predictive capabilities. Utilize Tableau and other visualization tools to create insightful and actionable dashboards for stakeholders. Manage and optimize large datasets using Snowflake and Teradata databases. Collaborate with cross-functional teams to understand business needs and translate them into analytical solutions. Stay updated with the latest advancements in data science, machine learning, and AI technologies. Mentor and guide junior data scientists, fostering a culture of continuous learning and development. Communicate complex analytical concepts and results to non-technical stakeholders effectively. We are an equal opportunity employer and value diversity at our company. We do not discriminate based on race, religion, colour, national origin, sex, gender, gender expression, sexual orientation, age, marital status, veteran status, or disability status.
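A minimal sketch of the machine-learning responsibility described above, using scikit-learn on a placeholder extract; the file name, features, and label are assumptions, and in practice the data would be pulled from Snowflake or Teradata.

```python
# Minimal sketch: training and evaluating a simple churn model of the sort
# this role would productionize. The CSV, features, and label are
# illustrative placeholders.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

df = pd.read_csv("customers.csv")                    # placeholder extract
X = df[["tenure_months", "monthly_spend", "support_tickets"]]
y = df["churned"]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y
)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)

auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"Holdout AUC: {auc:.3f}")
```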
Posted 1 month ago
8.0 - 13.0 years
25 - 40 Lacs
Bengaluru
Work from Office
Must-Have Skills:
• Azure Databricks / PySpark hands-on
• SQL/PL-SQL advanced level
• Snowflake – 2+ years
• Spark/data pipeline development – 2+ years
• Azure Repos / GitHub, Azure DevOps
• Unix shell scripting
• Cloud technology experience

Key Responsibilities:
1. Design, build, and manage data pipelines using Azure Databricks, PySpark, and Snowflake.
2. Analyze and resolve production issues (Tier 2 support with weekend/on-call rotation).
3. Write and optimize complex SQL/PL-SQL queries.
4. Collaborate on low-level and high-level design for data solutions.
5. Document all project deliverables and support deployment.

Good to Have:
• Knowledge of Oracle, Qlik Replicate, GoldenGate, Hadoop
• Job scheduler tools like Control-M or Airflow

Behavioral: Strong problem-solving & communication skills
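Combining this posting's Databricks and Snowflake must-haves, below is a minimal sketch of writing a Spark DataFrame to Snowflake via the Snowflake Spark connector; all connection options are placeholders, and the sketch assumes the connector library is installed on the cluster.

```python
# Minimal sketch: writing a Spark DataFrame to Snowflake with the Snowflake
# Spark connector. All connection options are illustrative placeholders,
# and the connector library must be available on the cluster.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("to_snowflake").getOrCreate()
df = spark.read.parquet("/mnt/curated/orders/")   # placeholder source

sf_options = {
    "sfURL": "my_account.snowflakecomputing.com",  # placeholder account
    "sfUser": "etl_user",
    "sfPassword": "***",
    "sfDatabase": "DW",
    "sfSchema": "PUBLIC",
    "sfWarehouse": "LOAD_WH",
}

(df.write
   .format("snowflake")          # short name registered by recent connectors
   .options(**sf_options)
   .option("dbtable", "ORDERS")
   .mode("overwrite")
   .save())
```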
Posted 1 month ago
5.0 - 10.0 years
17 - 30 Lacs
Hyderabad
Remote
At Mitratech, we are a team of technocrats focused on building world-class products that simplify operations in the Legal, Risk, Compliance, and HR functions of Fortune 100 companies. We are a close-knit, globally dispersed team that thrives in an ecosystem that supports individual excellence and takes pride in its diverse and inclusive work culture centered around great people practices, learning opportunities, and having fun! Our culture is the ideal blend of entrepreneurial spirit and enterprise investment, enabling the chance to move at a rapid pace with some of the most complex, leading-edge technologies available. Given our continued growth, we always have room for more intellect, energy, and enthusiasm - join our global team and see why it's so special to be a part of Mitratech!

Job Description
We are seeking a highly motivated and skilled Analytics Engineer to join our dynamic data team. The ideal candidate will possess a strong background in data engineering and analytics, with hands-on experience in modern analytics tools such as Airbyte, Fivetran, dbt, Snowflake, Airflow, etc. This role will be pivotal in transforming raw data into valuable insights, ensuring data integrity, and optimizing our data infrastructure to support the organization's data platform.

Essential Duties & Responsibilities
• Data integration and ETL processes: design, implement, and manage ETL pipelines using tools like Airbyte and Fivetran to ensure efficient and accurate data flow from various sources into our Snowflake data warehouse; maintain and optimize existing data integration workflows to improve performance and scalability
• Data modeling and transformation: develop and maintain data models using dbt / dbt Cloud to transform raw data into structured, high-quality datasets that meet business requirements; ensure data consistency and integrity across various datasets and implement data quality checks
• Data warehousing: manage and optimize our Redshift / Snowflake data warehouses, ensuring they meet performance, storage, and security requirements; implement best practices for data warehouse management, including partitioning, clustering, and indexing
• Collaboration and communication: work closely with data analysts, data scientists, and business stakeholders to understand data requirements and deliver solutions that meet their needs; communicate complex technical concepts to non-technical stakeholders in a clear and concise manner
• Continuous improvement: stay updated with the latest developments in data engineering and analytics tools, and evaluate their potential to enhance our data infrastructure; identify and implement opportunities for process improvements, automation, and optimization within the data pipeline

Requirements & Skills
• Education and experience: Bachelor's degree in Computer Science, Information Systems, Data Science, or a related field; 3-5 years of experience in data engineering or analytics engineering roles; experience in AWS and DevOps is a plus
• Technical skills: proficiency with modern ETL tools such as Airbyte and Fivetran; must have experience with dbt for data modeling and transformation (a minimal invocation sketch follows this listing); extensive experience working with Snowflake or similar cloud data warehouses; solid understanding of SQL and experience writing complex queries for data extraction and manipulation; familiarity with Python or other programming languages used for data engineering tasks
• Analytical skills: strong problem-solving skills and the ability to troubleshoot data-related issues
• Ability to understand business requirements and translate them into technical specifications
• Soft skills: excellent communication and collaboration skills; strong organizational skills and the ability to manage multiple projects simultaneously; detail-oriented with a focus on data quality and accuracy

We are an equal-opportunity employer that values diversity at all levels. All qualified applicants will receive consideration for employment without regard to race, color, religion, gender, national origin, age, sexual orientation, gender identity, disability, or veteran status.
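The dbt workflow this posting centres on can be wrapped in a few lines of Python for orchestration, as sketched below; the project directory and selectors are assumptions for illustration.

```python
# Minimal sketch: invoking dbt from a Python orchestration script and
# failing loudly when models or tests break. Paths and selectors are
# illustrative placeholders.
import subprocess
import sys

def run_dbt(args: list[str]) -> None:
    result = subprocess.run(
        ["dbt", *args],
        cwd="/opt/dbt/analytics",    # placeholder project directory
        capture_output=True,
        text=True,
    )
    print(result.stdout)
    if result.returncode != 0:
        print(result.stderr, file=sys.stderr)
        sys.exit(result.returncode)

run_dbt(["run", "--select", "staging+"])   # build staging and downstream models
run_dbt(["test", "--select", "staging+"])  # then verify data quality
```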
Posted 1 month ago
8.0 - 13.0 years
12 - 16 Lacs
Pune
Work from Office
What You'll Do
The Data Science Engineering team is looking for a Lead Data Analytics Engineer to join our team! You should be able to gather requirements, understand complex product, business, and engineering challenges, compose and prioritize research projects, and then build them in partnership with cloud engineers and architects, using the work of our data engineering team. You have deep SQL experience, an understanding of modern data stacks and technology, broad experience with data and all things data-related, and experience guiding a team through technical and design challenges. You will report to the Sr. Manager, Cloud Software Engineering, and be a part of the larger Data Engineering team.

What Your Responsibilities Will Be
• Avalara is looking for a data analytics engineer who can solve and scale real-world big data challenges, with end-to-end analytics experience: telling a complex data story with data models and reliable, applicable metrics
• Build and deploy data science models using complex SQL, Python, DBT data modelling, and re-usable visualization components (PowerBI/Tableau/Hex/R-shiny etc.)
• Expert-level experience in PowerBI, SQL, and Snowflake
• Solve needs at a large scale by applying your software engineering and complex data skills
• Lead and help develop a roadmap for the area and the team
• Analyze fault tolerance and high availability issues, performance, and scale challenges, and solve them
• Lead programs and collaborate with engineers, product managers, and technical program managers across teams
• Understand the trade-offs between consistency, durability, and costs to build solutions that can meet the demands of growing services
• Ensure the operational readiness of the services and meet the commitments to our customers regarding availability and performance
• Manage end-to-end project plans and ensure on-time delivery; communicate the status and big picture to the project team and management
• Work with business and engineering teams to identify scope, constraints, dependencies, and risks; identify risks and opportunities across the business and guide solutions

What You'll Need to be Successful
• Bachelor's Engineering degree in Computer Science or a related field
• 8+ years of enterprise-class experience with large-scale cloud solutions in data science/analytics projects and engineering projects
• Expert-level experience in PowerBI, SQL, and Snowflake
• Experience with data visualization, Python, data modeling, and data storytelling
• Experience architecting complex data marts applying DBT
• Architect and build data solutions that apply data quality and anomaly detection best practices (a small anomaly-check sketch follows this listing)
• Experience building production analytics using the Snowflake data platform
• Experience with AWS and Snowflake tools and services
• Good to have: a Snowflake certificate is a plus; relevant certifications in data warehousing or cloud platforms; experience architecting complex data marts applying DBT and Airflow
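A minimal sketch of the anomaly-detection practice mentioned above, using a simple z-score rule over a daily metric in pandas; the CSV and column names are placeholders, and production checks would be more robust (seasonality, rolling windows).

```python
# Minimal sketch: flagging anomalous days in a metric series with a
# z-score rule. The CSV and column names are illustrative placeholders.
import pandas as pd

daily = pd.read_csv("daily_revenue.csv", parse_dates=["day"])  # placeholder

mean = daily["revenue"].mean()
std = daily["revenue"].std()
daily["zscore"] = (daily["revenue"] - mean) / std

# Anything beyond 3 standard deviations gets surfaced for review.
anomalies = daily[daily["zscore"].abs() > 3]
print(anomalies[["day", "revenue", "zscore"]])
```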
Posted 1 month ago
7.0 - 12.0 years
13 - 23 Lacs
Pune, Bengaluru, Mumbai (All Areas)
Hybrid
We are eagerly seeking candidates with 5 to 13 years of experience for a Data Engineer / Lead role to join our dynamic team. The ideal candidate will play a pivotal role within the team and is a skilled professional with exposure to Python, Spark, Hive, and AWS. You will collaborate with internal teams to design, develop, deploy, and maintain software applications at scale.

Role: Data Engineer / Lead
Location: PAN India
Experience: 5 to 13 years
Job type: Full time
Work type: Hybrid

Data Engineer with a minimum of 5 years of relevant professional experience:
• Should have expertise in Python scripting and big data technologies like Spark, Hive, Presto, etc.
• Experience with AWS services – IAM, EC2, S3, EMR, Lambda functions, Step Functions, CloudWatch, Redshift, Athena, Glue, etc.
• Hands-on experience with Databricks
• Proficient writing Spark jobs in PySpark and Scala (a small PySpark UDF sketch follows this listing)
• Experience writing queries with both SQL and NoSQL DBs (Hive, HBase, MongoDB, Elasticsearch, PostgreSQL, etc.)
• Should have a good understanding of Python data structures, including data frames, datasets, RDDs, etc.
• Experience in ML – integration of ML models
• Experience with data profiling and data migration
• Developing Hive UDFs and Hive jobs
• Proven hands-on software development experience
• Experience with test-driven development
• Preferred experience in the insurance domain
• Must have a good understanding of data warehousing concepts
• Experience using CI/CD tools like GitHub Actions, Jenkins, Azure DevOps, etc.
• Experienced working in Agile projects – sprint planning, grooming, and providing estimations
• Experience using JIRA, Confluence, VS Code or similar IDEs, Jupyter notebooks, etc.
• Good communication and collaborative skills with internal and external teams
• Flexibility and ability to work in an onshore/offshore model involving multiple agile teams
• Mentor and guide junior developers, review code, and be familiar with estimation techniques using story points
• Strong analytical and problem-solving skills

Qualification you must require: Bachelor's or Master's in Computer Science or a related field
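To illustrate the PySpark work this role involves, here is a minimal UDF sketch; the masking logic and column names are assumptions, and built-in or pandas UDFs are usually preferred for performance.

```python
# Minimal sketch: a PySpark UDF of the kind used alongside Hive jobs in
# this stack. Column names and logic are illustrative placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import udf
from pyspark.sql.types import StringType

spark = SparkSession.builder.appName("udf_demo").getOrCreate()

def mask_pan(pan: str) -> str:
    """Mask all but the last four characters of a PAN-style identifier."""
    if pan is None or len(pan) < 4:
        return "****"
    return "*" * (len(pan) - 4) + pan[-4:]

mask_pan_udf = udf(mask_pan, StringType())

df = spark.createDataFrame([("ABCDE1234F",), ("XYZAB5678K",)], ["pan"])
df.withColumn("pan_masked", mask_pan_udf("pan")).show()
```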
Posted 1 month ago
4.0 - 7.0 years
9 - 12 Lacs
Pune
Hybrid
So, what’s the role all about?
At NiCE, a Senior Software professional specializes in designing, developing, and maintaining applications and systems using the Java programming language, playing a critical role in building scalable, robust, and high-performing applications for a variety of industries, including finance, healthcare, technology, and e-commerce.

How will you make an impact?
• Working knowledge of unit testing
• Working knowledge of user stories or use cases
• Working knowledge of design patterns or equivalent experience
• Working knowledge of object-oriented software design
• Team player

Have you got what it takes?
• Bachelor's degree in Computer Science, Business Information Systems, or a related field, or equivalent work experience, is required
• 4+ years (SE) of experience in software development
• Well-established technical problem-solving skills
• Experience in Java, Spring Boot, and microservices
• Experience with Kafka, Kinesis, KDA, Apache Flink
• Experience in Kubernetes operators, Grafana, Prometheus
• Experience with AWS technology, including EKS, EMR, S3, Kinesis, Lambdas, Firehose, IAM, CloudWatch, etc.

You will have an advantage if you also have:
• Experience with Snowflake or any DWH solution
• Excellent communication, problem-solving, and decision-making skills
• Experience in databases
• Experience in CI/CD, Git, GitHub Actions, and Jenkins-based pipeline deployments
• Strong experience in SQL

What's in it for you?
Join an ever-growing, market-disrupting, global company where the teams – comprised of the best of the best – work in a fast-paced, collaborative, and creative environment! As the market leader, every day at NiCE is a chance to learn and grow, and there are endless internal career opportunities across multiple roles, disciplines, domains, and locations. If you are passionate, innovative, and excited to constantly raise the bar, you may just be our next NiCEr!

Enjoy NiCE-FLEX!
At NiCE, we work according to the NiCE-FLEX hybrid model, which enables maximum flexibility: 2 days working from the office and 3 days of remote work, each week. Naturally, office days focus on face-to-face meetings, where teamwork and collaborative thinking generate innovation, new ideas, and a vibrant, interactive atmosphere.

Requisition ID: 6965
Reporting into: Tech Manager
Role Type: Individual Contributor
Posted 1 month ago
8.0 - 13.0 years
10 - 15 Lacs
Pune
Work from Office
What You'll Do
We are looking for a Senior Site Reliability Engineer (SRE) with a background in software engineering and database engineering to join our growing SRE team. This role is ideal for engineers who are passionate about building scalable systems, automating operational processes, and ensuring system availability and performance across complex distributed services. As a Senior SRE, you will help design and implement database infrastructure solutions that support our production environments, improve deployment pipelines, and ensure seamless application delivery. Your blend of development expertise and database experience will be necessary for overseeing projects across reliability, observability, and performance tuning. The candidate will report to the Sr. Manager, Reliability Engineering.

What Your Responsibilities Will Be
• Design scalable and resilient database infrastructure for mission-critical systems
• Maintain CI/CD pipelines and automated operational processes using tools like GitLab and Terraform
• Implement observability best practices, including logging, monitoring, tracing, and alerting (e.g., Prometheus, Grafana, Loki)
• Collaborate with development teams to ensure system designs are scalable, maintainable, and secure
• Manage and scale relational and non-relational databases (e.g., PostgreSQL, MySQL, MongoDB, Snowflake, and Kafka) with a focus on high availability and performance tuning
• Lead root cause analysis and postmortems for major incidents; promote long-term reliability improvements and contribute to internal tooling, automation frameworks, and infrastructure-as-code
• Good exposure on the frontend (Angular, ReactJS), backend (Python Flask), and general skills (UI/UX design principles, version control)
• Participate in database on-call rotations to respond to system incidents, ensure uptime service level agreements are met, and promote DB SRE best practices across teams

What You'll Need to be Successful
• 8+ years of experience in software engineering, DevOps, or DB SRE roles
• Programming experience with Angular, ReactJS, and Python Flask
• Experience in database engineering: schema design, query optimization, replication, backup/restore strategies, and other database administration tasks
• Expertise with containerization (Docker) and orchestration platforms (Kubernetes)
• Experience with distributed systems, networking, and cloud-native architectures (AWS and GCP)
• Familiarity with security practices related to infrastructure and data handling
• Experience with infrastructure-as-code tools (Terraform, etc.)
• Experience building scalable, resilient, and observable distributed systems, and the ability to work independently
Posted 1 month ago
8.0 - 13.0 years
10 - 15 Lacs
Pune
Work from Office
What You'll Do
The Global Analytics and Insights (GAI) team is seeking an experienced Data Visualization Manager to lead our data-driven decision-making initiatives. The ideal candidate will have a background in Power BI, expert-level SQL proficiency to drive actionable insights, demonstrated leadership and mentoring experience, and an ability to drive innovation and manage complex projects. You will become an expert in Avalara's financial, marketing, sales, and operations data. This position will report to a Senior Manager.

What Your Responsibilities Will Be
• You will define and execute the organization's BI strategy, ensuring alignment with business goals
• You will lead, mentor, and manage a team of BI developers and analysts, fostering continuous learning
• You will develop and implement robust data visualization and reporting solutions using Power BI
• You will optimize data models, dashboards, and reports to provide meaningful insights and support decision-making
• You will collaborate with business leaders, analysts, and cross-functional teams to gather and translate requirements into actionable BI solutions
• You will be a trusted advisor to business teams, identifying opportunities where BI can drive efficiencies and improvements
• You will ensure data accuracy, consistency, and integrity across multiple data sources
• You will stay updated with the latest advancements in BI tools, SQL performance tuning, and data visualization best practices
• You will define and enforce BI development standards, governance, and documentation best practices
• You will work closely with Data Engineering teams to define and maintain scalable data pipelines
• You will drive automation and optimization of reporting processes to improve efficiency

What You'll Need to be Successful
• 8+ years of experience in Business Intelligence, Data Analytics, or related fields
• 5+ years of expert proficiency in Power BI, including DAX, Power Query, data modeling, and dashboard creation
• 5+ years of strong SQL skills, with experience in writing complex queries, performance tuning, and working with large datasets
• Familiarity with cloud-based BI solutions (e.g., Azure Synapse, AWS Redshift, Snowflake) is a plus
• An understanding of ETL processes and data warehousing concepts
• Strong problem-solving, analytical thinking, and decision-making skills
Posted 1 month ago
3.0 - 8.0 years
19 - 25 Lacs
Bengaluru
Work from Office
About Zscaler
Serving thousands of enterprise customers around the world including 40% of Fortune 500 companies, Zscaler (NASDAQ: ZS) was founded in 2007 with a mission to make the cloud a safe place to do business and a more enjoyable experience for enterprise users. As the operator of the world’s largest security cloud, Zscaler accelerates digital transformation so enterprises can be more agile, efficient, resilient, and secure. The pioneering, AI-powered Zscaler Zero Trust Exchange™ platform, which is found in our SASE and SSE offerings, protects thousands of enterprise customers from cyberattacks and data loss by securely connecting users, devices, and applications in any location. Named a Best Workplace in Technology by Fortune and others, Zscaler fosters an inclusive and supportive culture that is home to some of the brightest minds in the industry. If you thrive in an environment that is fast-paced and collaborative, and you are passionate about building and innovating for the greater good, come make your next move with Zscaler.
Zscaler has an incredible story to tell, and our Marketing team is committed to sharing it in compelling and expressive ways. Our storytellers, analysts, strategists, and designers are attentive and dedicated to teaching our audience to think about cybersecurity like they never have before. You’ll collaborate with diverse, creative people around the globe to hone the Zscaler brand, increase awareness and demand, support partnerships, and drive home big wins for the world’s cloud security leader and our customers worldwide.
We're looking for a Sr. Analyst, Marketing Strategy & Analytics to join our team based in Bangalore. Reporting to the Senior Manager of Marketing Analytics, you will be responsible for:
Analyzing data from various sources to produce actionable marketing insights
Evaluating marketing campaigns by measuring metrics like lead conversion rates, pipeline generation, ROI, and RoAS (a small sketch of these calculations follows this listing)
Optimizing or creating Tableau dashboards to track marketing campaign performance and inform strategies
Developing regular metric reports on campaign performance with improvement hypotheses
Integrating account-level insights from multiple sources including marketing automation tools, CRM systems (SFDC), and Snowflake
What We're Looking for (Minimum Qualifications)
Bachelor’s degree with 3-5 years in analytics, data analysis, or consulting
Proficiency in aggregating, organizing, and analyzing large datasets using SQL and Snowflake
Understanding of marketing technology platforms like Google Analytics, Salesforce CRM, Marketo, and Segment
Experience with visualization tools like Salesforce and Tableau, focusing on design and user experience
What Will Make You Stand Out (Preferred Qualifications)
Hands-on experience with SFDC, Marketo, Snowflake, Python, and R
1-2 years in data science for experimental designs and hypothesis testing
Experience with data pipeline tools for automating data extraction from various platforms and APIs
At Zscaler, we are committed to building a team that reflects the communities we serve and the customers we work with. We foster an inclusive environment that values all backgrounds and perspectives, emphasizing collaboration and belonging. Join us in our mission to make doing business seamless and secure. Our Benefits program is one of the most important ways we support our employees.
Zscaler proudly offers comprehensive and inclusive benefits to meet the diverse needs of our employees and their families throughout their life stages, including:
Various health plans
Time off plans for vacation and sick time
Parental leave options
Retirement options
Education reimbursement
In-office perks, and more!
By applying for this role, you adhere to applicable laws, regulations, and Zscaler policies, including those related to security and privacy standards and guidelines.
Zscaler is committed to providing equal employment opportunities to all individuals. We strive to create a workplace where employees are treated with respect and have the chance to succeed. All qualified applicants will be considered for employment without regard to race, color, religion, sex (including pregnancy or related medical conditions), age, national origin, sexual orientation, gender identity or expression, genetic information, disability status, protected veteran status, or any other characteristic protected by federal, state, or local laws. See more information by clicking on the Know Your Rights: Workplace Discrimination is Illegal link.
Pay Transparency
Zscaler complies with all applicable federal, state, and local pay transparency rules.
Zscaler is committed to providing reasonable support (called accommodations or adjustments) in our recruiting processes for candidates who are differently abled, have long-term conditions, mental health conditions or sincerely held religious beliefs, or who are neurodivergent or require pregnancy-related support.
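To make the campaign metrics named in the listing above concrete, here is a small, hypothetical sketch of the arithmetic behind lead conversion rate, ROI, and RoAS. All figures and column names are invented for illustration; in practice these inputs would come from SFDC, Marketo, or Snowflake rather than a hand-built frame.

```python
# Hypothetical sketch of the campaign metrics named in the listing:
# conversion rate, ROI, and RoAS. All figures are invented.
import pandas as pd

campaigns = pd.DataFrame({
    "campaign":    ["webinar_q1", "paid_search_q1", "email_q1"],
    "leads":       [1200, 800, 3000],
    "conversions": [96, 120, 150],
    "spend":       [20_000.0, 35_000.0, 5_000.0],
    "revenue":     [90_000.0, 140_000.0, 30_000.0],
})

# Conversion rate: share of leads that converted.
campaigns["conversion_rate"] = campaigns["conversions"] / campaigns["leads"]
# ROI: net return per unit of spend; RoAS: gross revenue per unit of spend.
campaigns["roi"]  = (campaigns["revenue"] - campaigns["spend"]) / campaigns["spend"]
campaigns["roas"] = campaigns["revenue"] / campaigns["spend"]

print(campaigns[["campaign", "conversion_rate", "roi", "roas"]])
```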
Posted 1 month ago
7.0 - 11.0 years
32 - 35 Lacs
Hyderabad
Work from Office
About the job
We are seeking an experienced Data Engineering Specialist interested in challenging the status quo to ensure the seamless creation and operation of the data pipelines needed by Sanofi's advanced analytics, AI, and ML initiatives for the betterment of our global patients and customers. Sanofi has recently embarked on a vast and ambitious digital transformation program. A cornerstone of this roadmap is the acceleration of its data transformation and of the adoption of artificial intelligence (AI) and machine learning (ML) solutions, to accelerate R&D, manufacturing, and commercial performance and bring better drugs and vaccines to patients faster, to improve health and save lives.
Main Responsibilities:
Establish technical designs to meet Sanofi requirements aligned with architectural and data standards
Own the entire back end of the application, including the design, implementation, testing, and troubleshooting of the core application logic, databases, data ingestion and transformation, data processing and orchestration of pipelines, APIs, CI/CD integration, and other processes
Fine-tune and optimize queries using Snowflake platform and database techniques
Optimize ETL/data pipelines to balance performance, functionality, and other operational requirements
Assess and resolve data pipeline issues to ensure performance and timeliness of execution
Assist with technical solution discovery to ensure technical feasibility
Assist in setting up and managing CI/CD pipelines and developing automated tests
Develop and manage microservices using Python (a minimal API sketch follows this listing)
Conduct peer reviews for quality, consistency, and rigor of production-level solutions
Design application architecture for efficient concurrent user handling, ensuring optimal performance during high-usage periods
Own all areas of the product lifecycle: design, development, test, deployment, operation, and support
About you
Qualifications:
5+ years of relevant experience developing backend, integration, data pipelining, and infrastructure
Expertise in database optimization and performance improvement
Expertise in Python, PySpark, and Snowpark
Experience with data warehousing and object-relational databases (Snowflake and PostgreSQL) and writing efficient SQL queries
Experience with cloud-based data platforms (Snowflake, AWS)
Proficiency in developing robust, reliable APIs using Python and the FastAPI framework
Expertise in ELT and ETL, with experience working with large datasets and performance and query optimization; IICS is a plus
Understanding of data structures and algorithms
Understanding of DBT is a plus
Experience with modern testing frameworks (SonarQube; K6 is a plus)
Strong collaboration skills and willingness to work with others to ensure seamless integration of the server side and client side
Knowledge of DevOps best practices and associated tools is a plus, especially in the setup, configuration, maintenance, and troubleshooting of:
Containers and containerization technologies (Kubernetes, Argo, Red Hat OpenShift)
Infrastructure as code (Terraform)
Monitoring and logging (CloudWatch, Grafana)
CI/CD pipelines (JFrog Artifactory)
Scripting and automation (Python, GitHub, GitHub Actions)
Experience with JIRA & Confluence
Workflow orchestration (Airflow)
Message brokers (RabbitMQ)
Education: Bachelor’s degree in computer science, engineering, or a similar quantitative field of study
Why choose us?
Bring the miracles of science to life alongside a supportive, future-focused team.
Discover endless opportunities to grow your talent and drive your career, whether it’s through a promotion or lateral move, at home or internationally.
Enjoy a thoughtful, well-crafted rewards package that recognizes your contribution and amplifies your impact.
Take good care of yourself and your family, with a wide range of health and wellbeing benefits including high-quality healthcare, prevention and wellness programs, and at least 14 weeks’ gender-neutral parental leave.
Opportunity to work in an international environment, collaborating with diverse business teams and vendors, working in a dynamic team, and fully empowered to propose and implement innovative ideas.
Pursue Progress. Discover Extraordinary.
Progress doesn’t happen without people – people from different backgrounds, in different locations, doing different roles, all united by one thing: a desire to make miracles happen. You can be one of those people. Chasing change, embracing new ideas, and exploring all the opportunities we have to offer. Let’s pursue progress. And let’s discover extraordinary together.
At Sanofi, we provide equal opportunities to all regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, or gender identity.
Watch our ALL IN video and check out our Diversity Equity and Inclusion actions at sanofi.com!
Languages: English is a must.
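As a hedged sketch of the Python/FastAPI API work the Sanofi listing calls for (not Sanofi's actual code), the snippet below fronts a warehouse query with a small read endpoint. The service name, table, endpoint path, and connection URL are all invented for illustration.

```python
# Minimal FastAPI sketch of a read endpoint fronting a warehouse query.
# Endpoint, table, and engine URL are hypothetical, not Sanofi's code.
import pandas as pd
from fastapi import FastAPI, HTTPException
from sqlalchemy import create_engine, text

app = FastAPI(title="pipeline-metrics")       # hypothetical service name
engine = create_engine("snowflake://...")     # placeholder connection URL

@app.get("/pipelines/{pipeline_id}/runs")
def latest_runs(pipeline_id: str, limit: int = 10):
    """Return the most recent runs for one pipeline."""
    query = text(
        "SELECT run_id, status, started_at "
        "FROM ops.pipeline_runs "             # hypothetical table
        "WHERE pipeline_id = :pid "
        "ORDER BY started_at DESC LIMIT :lim"
    )
    df = pd.read_sql(query, engine, params={"pid": pipeline_id, "lim": limit})
    if df.empty:
        raise HTTPException(status_code=404, detail="pipeline not found")
    return df.to_dict(orient="records")
```

Bound parameters (:pid, :lim) rather than string interpolation keep the endpoint safe from SQL injection, which matters when an API sits directly in front of a warehouse.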
Posted 1 month ago
5.0 - 10.0 years
15 - 25 Lacs
Pune, Chennai, Coimbatore
Work from Office
About the Role:
We are seeking a skilled ETL (Extract, Transform, Load) Tester to join our team. The ETL Tester will be responsible for ensuring the quality of data extraction, transformation, and loading processes within the data warehouse environment. The ideal candidate will have a strong understanding of ETL processes, experience with testing large datasets, and the ability to work closely with development and business intelligence teams to ensure the reliability and accuracy of data.
Key Responsibilities:
Understanding business requirements and ETL specifications to develop test plans and test cases.
Executing test cases to ensure accurate extraction, transformation, and loading of data from source systems to the data warehouse.
Verifying data integrity and accuracy throughout the ETL process.
Identifying and reporting data quality issues and working with the development team to address them.
Collaborating with business analysts, ETL developers, and other team members to ensure comprehensive testing coverage.
Participating in cross-functional meetings to provide testing insights and contribute to process improvements.
Documenting test results, issues, and resolution steps for future reference.
Qualifications:
Bachelor's degree in Computer Science, Information Systems, or a related field.
Proven experience in ETL testing, preferably in the insurance domain.
Familiarity with ETL tools such as Informatica, Talend, or SSIS.
Strong SQL skills for data validation and verification (a minimal validation sketch follows this listing).
Knowledge of data warehousing concepts and methodologies.
Understanding of insurance industry processes, data models, and regulatory requirements.
Ability to work effectively in a team environment and communicate complex technical concepts clearly.
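To illustrate the SQL-based validation this posting asks for, here is a minimal, hypothetical reconciliation check comparing row counts and a simple column checksum between a source staging table and its warehouse target. The table names, column, and connection URLs are invented, and a real ETL test suite would cover far more (nulls, duplicates, referential integrity, transformation rules).

```python
# Illustrative ETL reconciliation check: row counts and a column checksum
# compared between source and target. Tables and DSNs are hypothetical.
from sqlalchemy import create_engine, text

source = create_engine("postgresql://...")   # placeholder source DSN
target = create_engine("snowflake://...")    # placeholder warehouse DSN

CHECK_SQL = "SELECT COUNT(*), COALESCE(SUM(amount), 0) FROM {table}"

def profile(engine, table: str) -> dict:
    """Row count and amount checksum for one table."""
    with engine.connect() as conn:
        row = conn.execute(text(CHECK_SQL.format(table=table))).one()
        return {"rows": row[0], "amount_sum": row[1]}

src = profile(source, "staging.policies")    # hypothetical source table
tgt = profile(target, "dw.fact_policies")    # hypothetical target table

for metric in ("rows", "amount_sum"):
    ok = src[metric] == tgt[metric]
    print(f"{metric}: source={src[metric]} target={tgt[metric]} "
          f"{'PASS' if ok else 'FAIL'}")
```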
Posted 1 month ago