Get alerts for new jobs matching your selected skills, preferred locations, and experience range. Manage Job Alerts
6.0 - 7.0 years
14 - 18 Lacs
Pune
Work from Office
As an Associate Software Developer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets. In this role, your responsibilities may include: Implementing and validating predictive models as well as creating and maintain statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques Designing and implementing various enterprise search applications such as Elasticsearch and Splunk for client requirements Work in an Agile, collaborative environment, partnering with other scientists, engineers, consultants and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviors. Build teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modeling results Required education Bachelor's Degree Preferred education Master's Degree Required technical and professional expertise Total Exp-6-7 Yrs (Relevant-4-5 Yrs) Mandatory Skills: Azure Databricks, Python/PySpark, SQL, Github, - Azure Devops - Azure Blob Ability to use programming languages like Java, Python, Scala, etc., to build pipelines to extract and transform data from a repository to a data consumer Ability to use Extract, Transform, and Load (ETL) tools and/or data integration, or federation tools to prepare and transform data as needed. Ability to use leading edge tools such as Linux, SQL, Python, Spark, Hadoop and Java Preferred technical and professional experience You thrive on teamwork and have excellent verbal and written communication skills. Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions Ability to communicate results to technical and non-technical audiences
Posted 1 month ago
6.0 - 7.0 years
14 - 18 Lacs
Bengaluru
Work from Office
As an Associate Software Developer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets. In this role, your responsibilities may include: Implementing and validating predictive models as well as creating and maintain statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques Designing and implementing various enterprise search applications such as Elasticsearch and Splunk for client requirements Work in an Agile, collaborative environment, partnering with other scientists, engineers, consultants and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviors. Build teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modeling results Required education Bachelor's Degree Preferred education Master's Degree Required technical and professional expertise Total Exp-6-7 Yrs (Relevant-4-5 Yrs) Mandatory Skills: Azure Databricks, Python/PySpark, SQL, Github, - Azure Devops - Azure Blob Ability to use programming languages like Java, Python, Scala, etc., to build pipelines to extract and transform data from a repository to a data consumer Ability to use Extract, Transform, and Load (ETL) tools and/or data integration, or federation tools to prepare and transform data as needed. Ability to use leading edge tools such as Linux, SQL, Python, Spark, Hadoop and Java Preferred technical and professional experience You thrive on teamwork and have excellent verbal and written communication skills. Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions Ability to communicate results to technical and non-technical audiences
Posted 1 month ago
8.0 - 13.0 years
10 - 14 Lacs
Hyderabad
Work from Office
#Employment Type: Contract Skills Azure Data Factory SQL Azure Blob Azure Logic Apps
Posted 1 month ago
2.0 - 6.0 years
2 - 6 Lacs
Vadodara, Gujarat, India
On-site
Internal Job Title: Data Pipeline Analyst Business: Lucy Electric Manufacturing & Technologies India Location: Halol, Vadodara, Gujarat Job Reference No: 3940 Job Purpose: To support the provision of key business insights by building and maintaining data pipelines and structures, using programming tools and languages including Python and MS SQL. Job Context: Working closely with the Data & Analytics Development Lead and cross-functional teams to ensure a coordinated approach to Business Intelligence delivery. The role involves providing information across multiple businesses for comparative and predictive analysis, highlighting opportunities for business process improvement. Job Dimensions: The role is an onsite role, with flexible attendance at our office in Vadodara, India, to support business engagement. There is an occasional need to visit other sites and business partners at their premises to build stakeholder relationships or to attend specific industry events, globally. Key Accountabilities: Capturing requirements and preparing specifications for data pipelines and reporting Developing prioritised BI outputs to agreed quality and security standards Assisting the Data & Analytics team with technical integration of data sources Conducting training and coaching sessions to support business users understanding of data Collaborating with the wider business to promote appropriate use of data & analytics tools Maintaining operational and customer-facing documentation for support processes and defined project deliverables Improving analytics capabilities for BI services in an evergreen ecosystem Troubleshooting production issues and coordinating with the wider IT Team to resolve incidents and complete tasks using IT Service Management tools, as part of a cross-functional team Qualifications, Experience & Skills: A bachelor's degree (or equivalent professional qualifications and experience) in a relevant stream Effective communication skills in English 4 years of experience in data transformation and/or creating data pipelines, including Python Good understanding of Microsoft data storage tools such as Azure Blob and Data Lake Working knowledge of statistical methods to validate findings, ensure data accuracy and drive data-driven decision making Knowledge of Exploratory Data Analysis to identify key insights, potential issues, and areas for further investigation Conduct design reviews, propose enhancements, and design best-fit solutions Provide Business as Usual support (BAU) and support ad-hoc reporting requirements alongside project work Identify process improvement and efficiency opportunities through data analysis and recommend automation or optimisation General understanding of a company's value chain and basic manufacturing industry terminology Good to Have Skills: ETL using Data Pipeline tools (e.g., SSIS, Azure Data Factory or similar) Microsoft SQL, Dynamics 365 (D365), Microsoft Dataverse REST APIs, CI/CD on Azure DevOps Data Quality, Data Sensitivity, Near Time and Real Time data processing Behavioral Competencies: Good interpersonal skills to enable process improvement through positive interaction Problem-solving mindset with a desire to share knowledge and support others Customer-oriented, flexible, and focused on stakeholder satisfaction Does this sound interesting We would love to hear from you. Our application process is quick and easy. Apply today!
Posted 1 month ago
0.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose - the relentless pursuit of a world that works better for people - we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI. Inviting applications for the role of Lead Consultant - Snowflake! Responsibilities: . Ability to design and implement effective analytics solutions and models with Snowflake . Hand-on experience in Snowflake SQL, Writing SQL queries against Snowflake Developing scripts Unix, Python, etc. to do Extract, Load, and Transform data . Hands-on experience with Snowflake utilities such as SnowSQL, SnowPipe, Tasks, Streams, Time travel, Optimizer, Metadata Manager, data sharing, and stored procedures. . Should be able to implement Snowpipe, Stage and file upload to Snowflake database . Hand-on Experience on any RBDMS/NoSQL database with strong SQL writing skills . In-depth understanding of Data Warehouse/ODS, ETL concept and modeling structure principles . Experience in Data warehousing - OLTP, OLAP, Dimensions, Facts, and Data modeling. . Hands-on Experience on Azure Blob Qualifications we seek in you! Minimum Qualifications / Skills . SnowSQL, SnowPipe, Tasks, Streams, Time travel . Certified SnowPro Core . Good Understanding of Data Warehousing & Reporting tools . Able to work on own initiative and as a team player . Good organizational skills with cultural awareness and sensitivity . Education: ME/ M.Tech./ MS (Engg/ Sciences) and BE/BTech (Engineering) . Industry: Manufacturing/Industrial Behavioral Requirements: . Lives client&rsquos core values of courage and curiosity to deliver the best business solutions for EL-Business . Ability to o work in diversified teams o convey messages and ideas clearly to the users and project members o listen, understand, appreciate, and appropriately respond to the users . Excellent team player with strong oral and written communication skills . Possess strong time management skills . Keeps up-to-date and informed of client technology landscape and client IS Strategy planned or ad-hoc changes. Preferred Skills/Qualifications Azure storage services such as Blob, Data Lake, Cosmos DB and SQL Server. Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Get to know us at and on , , , and . Furthermore, please do note that Genpact does not charge fees to process job applications and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a %27starter kit,%27 paying to apply, or purchasing equipment or training.
Posted 1 month ago
5.0 - 8.0 years
7 - 11 Lacs
Chennai
Work from Office
Cloud Migration Specialist Chennai Rates including mark up - 170K/ No Of position - 3 Mandatory skills -AWS and Azure Cloud migrations, on-prem applications to AWS/Azure Experience - 5 - 8 years Position Overview: We are seeking a skilled Cloud Engineer with expertise in AWS and Azure Cloud migrations. The ideal candidate will lead the migration of on-premises applications to AWS/Azure, optimize cloud infrastructure, and ensure seamless transitions. Key Responsibilities: Plan and execute migrations of on-prem applications to AWS/Azure. Utilize or Develop migration tools for large-scale application migrations. Design and implement automated application migrations. Collaborate with cross-functional teams to troubleshoot and resolve migration issues. Qualifications: 5+ years of AWS/Azure cloud migration experience. Proficiency in Cloud compute (EC2, EKS, Azure VM, AKS) and Storage (s3, EBS,EFS, Azure Blob, Azure Managed Disks, Azure Files). Strong knowledge of AWS and Azure cloud services and migration tools. Expert in terraform. AWS/Azure certification preferred. Team Management Resourcing Forecast talent requirements as per the current and future business needs Hire adequate and right resources for the team Train direct reportees to make right recruitment and selection decisions Talent Management Ensure 100% compliance to Wipros standards of adequate onboarding and training for team members to enhance capability & effectiveness Build an internal talent pool of HiPos and ensure their career progression within the organization Promote diversity in leadership positions Performance Management Set goals for direct reportees, conduct timely performance reviews and appraisals, and give constructive feedback to direct reports. Ensure that organizational programs like Performance Nxt are well understood and that the team is taking the opportunities presented by such programs to their and their levels below Employee Satisfaction and Engagement Lead and drive engagement initiatives for the team Track team satisfaction scores and identify initiatives to build engagement within the team Proactively challenge the team with larger and enriching projects/ initiatives for the organization or team Exercise employee recognition and appreciation Deliver NoPerformance ParameterMeasure1Operations of the towerSLA adherence Knowledge management CSAT/ Customer Experience Identification of risk issues and mitigation plans Knowledge management2New projectsTimely delivery Avoid unauthorised changes No formal escalations Mandatory Skills: Cloud Azure Admin. Experience5-8 Years.
Posted 2 months ago
0.0 - 2.0 years
1 - 1 Lacs
Bengaluru
Work from Office
Join us as a Software Dev Intern! Work on Next.js, React, Node.js, and PostgreSQL to build scalable apps, APIs, and user-friendly UIs. Collaborate, write clean code, and help shape real features in a fast-paced environment.
Posted 2 months ago
6.0 - 7.0 years
8 - 9 Lacs
Bengaluru
Work from Office
As an Associate Software Developer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets. In this role, your responsibilities may include: Implementing and validating predictive models as well as creating and maintain statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques Designing and implementing various enterprise search applications such as Elasticsearch and Splunk for client requirements Work in an Agile, collaborative environment, partnering with other scientists, engineers, consultants and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviors. Build teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modeling results Required education Bachelor's Degree Preferred education Master's Degree Required technical and professional expertise Total Exp-6-7 Yrs (Relevant-4-5 Yrs) Mandatory Skills: Azure Databricks, Python/PySpark, SQL, Github, - Azure Devops - Azure Blob Ability to use programming languages like Java, Python, Scala, etc., to build pipelines to extract and transform data from a repository to a data consumer Ability to use Extract, Transform, and Load (ETL) tools and/or data integration, or federation tools to prepare and transform data as needed. Ability to use leading edge tools such as Linux, SQL, Python, Spark, Hadoop and Java Preferred technical and professional experience You thrive on teamwork and have excellent verbal and written communication skills. Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions Ability to communicate results to technical and non-technical audiences
Posted 2 months ago
6.0 - 7.0 years
8 - 9 Lacs
Pune
Work from Office
As an Associate Software Developer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets. In this role, your responsibilities may include: Implementing and validating predictive models as well as creating and maintain statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques Designing and implementing various enterprise seach applications such as Elasticsearch and Splunk for client requirements Work in an Agile, collaborative environment, partnering with other scientists, engineers, consultants and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviors Required education Bachelor's Degree Preferred education Master's Degree Required technical and professional expertise Total Exp-6-7 Yrs (Relevant-4-5 Yrs) Mandatory Skills: Azure Databricks, Python/PySpark, SQL, Github, - Azure Devops - Azure Blob Ability to use programming languages like Java, Python, Scala, etc., to build pipelines to extract and transform data from a repository to a data consumer Ability to use Extract, Transform, and Load (ETL) tools and/or data integration, or federation tools to prepare and transform data as needed. Ability to use leading edge tools such as Linux, SQL, Python, Spark, Hadoop and Java Preferred technical and professional experience You thrive on teamwork and have excellent verbal and written communication skills. Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions Ability to communicate results to technical and non-technical audiences.
Posted 2 months ago
6.0 - 7.0 years
8 - 9 Lacs
Pune
Work from Office
As an Associate Software Developer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets. In this role, your responsibilities may include: Implementing and validating predictive models as well as creating and maintain statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques Designing and implementing various enterprise seach applications such as Elasticsearch and Splunk for client requirements Work in an Agile, collaborative environment, partnering with other scientists, engineers, consultants and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviors Required education Bachelor's Degree Preferred education Master's Degree Required technical and professional expertise Total Exp-6-7 Yrs (Relevant-4-5 Yrs) Mandatory Skills: Azure Databricks, Python/PySpark, SQL, Github, - Azure Devops - Azure Blob Ability to use programming languages like Java, Python, Scala, etc., to build pipelines to extract and transform data from a repository to a data consumer Ability to use Extract, Transform, and Load (ETL) tools and/or data integration, or federation tools to prepare and transform data as needed. Ability to use leading edge tools such as Linux, SQL, Python, Spark, Hadoop and Java Preferred technical and professional experience You thrive on teamwork and have excellent verbal and written communication skills. Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions Ability to communicate results to technical and non-technical audiences.
Posted 2 months ago
8.0 - 13.0 years
10 - 14 Lacs
Hyderabad
Work from Office
Req ID: 317928 We are currently seeking a Java Backend Developer - Digital Engineering Lead Engineer to join our team in Hyderabad, Telangana (IN-TG), India (IN). Results-Based Lead Software Engineer (Product Development) Other Comparable titles SaaS Backend Developer Overview As a Backend Developer/ Architect, you need to participate in estimating, technical design, implementation, documentation, testing, deployment and support of applications developed for our clients. As a member working in a team environment, you will work with solution architects to interpret/translate written business requirements into technical design/code. Scope Core responsibilities to include building backend Rest API Services based on Spring Boot deployed in a SaaS environment The dev team currently comprises of 10+ global associates across US and India (COE) Our current technical environment Software Spring Boot Microservices, Building Portal component, Azure SQL, Spock groovy Application Architecture Service deployed on Azure Frameworks/Others KAFKA , GitHub, CI/CD, Java, J2EE, Docker, Kubernetes Experience on SaaS What you"™ll do Development of REST API in a Microservices architecture (Spring Boot) and deployed on Microsoft"™s Azure platform. The architecture includes technology components such as ReactJS and JavaScript/Typescript (UI), Spring Boot (Backend), Azure SQL, Azure Blob, Azure Logic Apps, Portal and Supply Chain planning software Be a senior member of a highly skilled team seeking systematic approaches to improve engineering productivity, efficiency, effectiveness, and quality Support our existing customer base with the newer enhancements/ defects fixing Create technical documentation Provide early visibility and mitigation to technical challenges through the journey. Confidently represents product and portfolio What we are looking for Bachelor"™s degree (STEM preferred) and minimum 8+ years of experience in Software development; ideally a candidate that has started as a Software Engineer and progressed to Lead Software Engineer Strong experience in programming and problem solving Hands-on development skills along with design experience; should not have moved away from software development Experience in building products with an API first approach in a SaaS environment Required Skills Java, Spring Boot, SQL Preferred Skills Knowledge of Public Clouds (Azure, AWS etc.), Spring Cloud, Docker, Kubernetes Experience in Supply Chain domain is a plus Good Understanding of secure architectures, secure configuration, identity management, role-based access control, authentication & authorization, and data encryption.
Posted 2 months ago
6.0 - 7.0 years
5 - 9 Lacs
Navi Mumbai
Work from Office
Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets. In this role, your responsibilities may include Implementing and validating predictive models as well as creating and maintain statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques Designing and implementing various enterprise search applications such as Elasticsearch and Splunk for client requirements Work in an Agile, collaborative environment, partnering with other scientists, engineers, consultants and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours. Build teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modeling results Required education Bachelor's Degree Preferred education Master's Degree Required technical and professional expertise Total Exp-6-7 Yrs (Relevant-4-5 Yrs) Mandatory Skills: Azure Databricks, Python/PySpark, SQL, Github, - Azure Devops - Azure Blob Ability to use programming languages like Java, Python, Scala, etc., to build pipelines to extract and transform data from a repository to a data consumer Ability to use Extract, Transform, and Load (ETL) tools and/or data integration, or federation tools to prepare and transform data as needed. Ability to use leading edge tools such as Linux, SQL, Python, Spark, Hadoop and Java Preferred technical and professional experience You thrive on teamwork and have excellent verbal and written communication skills. Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions Ability to communicate results to technical and non-technical audiences
Posted 2 months ago
6.0 - 7.0 years
14 - 18 Lacs
Kochi
Work from Office
As an Associate Software Developer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets. In this role, your responsibilities may include: Implementing and validating predictive models as well as creating and maintain statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques Designing and implementing various enterprise seach applications such as Elasticsearch and Splunk for client requirements Work in an Agile, collaborative environment, partnering with other scientists, engineers, consultants and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviors. Build teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modeling results Required education Bachelor's Degree Preferred education Master's Degree Required technical and professional expertise Total Exp-6-7 Yrs (Relevant-4-5 Yrs) Mandatory Skills: Azure Databricks, Python/PySpark, SQL, Github, - Azure Devops - Azure Blob Ability to use programming languages like Java, Python, Scala, etc., to build pipelines to extract and transform data from a repository to a data consumer Ability to use Extract, Transform, and Load (ETL) tools and/or data integration, or federation tools to prepare and transform data as needed. Ability to use leading edge tools such as Linux, SQL, Python, Spark, Hadoop and Java Preferred technical and professional experience You thrive on teamwork and have excellent verbal and written communication skills. Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions Ability to communicate results to technical and non-technical audiences
Posted 2 months ago
3.0 - 6.0 years
12 - 20 Lacs
Pune
Hybrid
Required Data Engineer for our client company Product base IT Company Job location Pune Salary upto 20 LPA Exp. 3 to 6 years Need immediate joiner only work mode hybrid apply here Required Candidate profile Required Candidate profile candidate should be good in Communication need immediate joiners only (within 15 Days) Excellent communication skill must
Posted 2 months ago
5 - 10 years
22 - 25 Lacs
Hyderabad
Work from Office
Networking Architecture: Design and deploy network architectures on Azure, including hybrid and multi-cloud solutions. Configure Azure networking components such as VNets, VPN gateways, ExpressRoute, and Azure Route Server. Implement network segmentation and micro-segmentation to enhance security and traffic isolation. Deploy load balancers, application gateways, and traffic managers for scalable solutions. Ensure high availability and disaster recovery for networking components. Server Administration: Design and implement virtual machines (VMs) and scale sets in Azure. Configure and manage Windows and Linux servers in Azure. Implement Azure Auto-Scaling and Load Balancing solutions for virtual machines. Oversee server patch management, monitoring, and backup strategies. Storage Solutions Design: Architect and manage Azure storage services, including Azure Blob, Azure Files, and Azure Disks. Design and implement backup and recovery solutions using Azure Backup and Azure Site Recovery. Optimize storage solutions for cost and performance while ensuring security compliance. Configure storage accounts, access tiers, and redundancy options (LRS, GRS, ZRS). Optimization and Troubleshooting: Monitor and optimize Azure resources for performance, scalability, and cost-effectiveness. Troubleshoot complex networking, server, and storage issues in Azure environments. Use tools like Azure Monitor, Network Watcher, and Log Analytics for proactive issue identification
Posted 3 months ago
8 - 13 years
12 - 22 Lacs
Hyderabad, Bengaluru, Mumbai (All Areas)
Work from Office
Greetings of The Day...!!! We have an URGENT on-rolls opening for the position of "Snowflake Architect" at One of our reputed clients for WFH. Name of the Company - Confidential Rolls - Onrolls Mode of Employment - FTE / Sub-Con / Contract Job Location - Remote Job Work Timings Night Shift – 06.00 pm to 03.00 am IST Nature of Work – Work from Home Working Days – 5 Days Weekly Educational Qualification - Bachelor's degree in computer science, BCA, engineering, or a related field. Salary – Maximum CTC Would be 23LPA (Salary & benefits package will be commensurate with experience and qualifications, PF, Medical Insurance cover available) Language Known - English, Hindi, & local language. Experience – 9 Years + of relevant experience in the same domain. Job Summary: We are seeking a highly skilled and experienced Snowflake Architect to lead the design, development, and implementation of scalable, secure, and high-performance data warehousing solutions on the Snowflake platform. The ideal candidate will possess deep expertise in data modelling, cloud architecture, and modern ELT frameworks. You will be responsible for architecting robust data pipelines, optimizing query performance, and ensuring enterprise-grade data governance and security. In this role, you will collaborate with data engineers, analysts, and business stakeholders to deliver efficient data solutions that drive informed decision-making across the organization. Key Responsibilities: Manage and maintain the Snowflake platform to ensure optimal performance and reliability. Collaborate with data engineers and analysts to design and implement data pipelines. Develop and optimize SQL queries for efficient data retrieval and manipulation. Create custom scripts and functions using JavaScript and Python to automate platform tasks. Troubleshoot platform issues and provide timely resolutions. Implement security best practices to protect data within the Snowflake platform. Stay updated on the latest Snowflake features and best practices to continuously improve platform performance. Required Qualifications: Bachelor’s degree in computer science, Engineering, or a related field. Minimum of Nine years of experience in managing any Database platform. Proficiency in SQL for data querying and manipulation. Strong programming skills in JavaScript and Python. Experience in optimizing and tuning Snowflake for performance. Preferred Skills: Technical Expertise Cloud & Integration Performance & Optimization Security & Governance Soft Skills THE PERSON SHOULD BE WILLING TO JOIN IN 07-10 DAYS TIME OR IMMEDIATE JOINER. Request for interested candidates; Please share your updated resume with us below Email-ID executivehr@monalisammllp.com, also candidate can call or WhatsApp us at 9029895581. Current /Last Net in Hand - Salary will be offered based on the interview /Technical evaluation process -- Notice Period & LWD was/will be - Reason for Changing the job - Total Years of Experience in Specific Field – Please specify the location which you are from – Do you hold any offer from any other association - ? Regards, Monalisa Group of Services HR Department 9029895581 – Call / WhatsApp executivehr@monalisammllp.com
Posted 3 months ago
5.0 - 10.0 years
3 - 7 Lacs
hyderabad
Work from Office
Job Purpose At Intercontinental Exchange (ICE), we engineer technology, exchanges and clearing houses that connect companies around the world to global capital and derivative markets. With a leading-edge approach to developing technology platforms, we have built market infrastructure in all major trading centers, offering customers the ability to manage risk and make informed decisions globally. By leveraging our core strengths in technology, we continue to identify new ways to serve our customers and transform global markets. AIP Suites (Data Modernization to Snowflake) builds an analytics-ready data architecture where data from source systems such as PDM (Product Data Management) and RDO is ingested into Snowflake for centralized storage and modeling. These models support ICE BI, which consumes Snowflake data for analytics and dashboarding. This design ensures clean separation between raw ingestion, transformation, analytics, and service-based consumption, supporting scalable and future-proof data-driven operations. ICE Mortgage Technology is seeking a Data Engineer who will be responsible for design and optimize SQL queries, develop stored procedures, and participate in the migration and modernization of legacy applications to support IMT (ICE Mortgage Technology) Products. The candidate should have a strong background in SQL and Stored Procedures Responsibilities Provides Snowflake-based data warehouse design and development for projects involving new data integration, migration, and enhancement of existing pipelines. Designs and develops data transformation logic using SQL, Snowflake stored procedures, and Python-based scripts for ETL/ELT workloads. Builds and maintains robust data pipelines to support reporting, analytics, and application data needs. Creates and maintains Snowflake objects like tables, views, streams, tasks, file formats, and external stages. Participates in project meetings with data engineers, analysts, business users, and product owners to understand and implement technical requirements. Writes technical design documentation based on business requirements and data architecture principles. Develops and/or reviews unit testing protocols for SQL scripts, procedures, and data pipelines using automation frameworks. Completes documentation and procedures for pipeline deployment, operational handover, and monitoring. May mentor or guide junior developers and data engineers. Stays current with Snowflake features, best practices, and industry trends in cloud data platforms. Performs additional related duties as assigned. Knowledge and Experience Bachelors Degree or the equivalent combination of education, training, or work experience. 5+ years of professional experience in data engineering or database development. Strong Hands-on experience: Writing complex SQL queries and stored procedures Database stored procedures, functions, views, and schema design Using Streams, Tasks, Time Travel, and Cloning Proficiency in database performance tuning and performance optimization clustering, warehouse sizing, caching, etc. Experience configuring external stages to integrate with cloud storage (AWS S3, Azure Blob, etc.). Experience writing Python/Shell scripts for data processing (where needed). Knowledge on Snowflake and Tidal is an added advantage Proficiency in using Git and working within Agile/Scrum SDLC environments. Familiarity working in a Software Development Life Cycle (SDLC) leveraging Agile principles. Excellent analytical, decision-making, and problem-solving skills. Ability to multitask in a fast-paced environment with a focus on timeliness, documentation, and communication with peers and business users. Strong verbal and written communication skills to engage both technical and non-technical audiences at various organizational levels.
Posted Date not available
5.0 - 9.0 years
15 - 20 Lacs
pune
Work from Office
Pune, India Java Full Stack BCM Industry 09/05/2025 Project description In Securities Operations IT, we are looking for individuals who are passionate about what they do, who value excellence, learning and integrity. Our culture places emphasis on team work, collaboration and delivering business value while working at a sustainable pace. Most importantly we are looking for someone who is technically excellent and can aspire, and inspire others, to our values and culture. Responsibilities Design, develop, and maintain microservices using Spring Boot and Java. Implement ReactJS front-end components, ensuring responsiveness and performance. Build and maintain complex data-driven applications using MSSQL, PostgreSQL, and Redis for caching. Integrate with Apache Kafka and Apache Flink for event-stream processing and real-time data analytics. Work with Flowable BPMN, CMMN, and Decision Tables for business process automation. Write comprehensive unit and integration tests using JUnit and TestContainers for backend services. Implement UI testing with Jest and React Test for front-end components. Develop and manage cloud-based services using Azure Cloud, including Azure Blob Containers. Automate CI/CD pipelines using GitLab for continuous integration and deployment. Optimize performance, scalability, and security of applications and services. Collaborate with cross-functional teams to define system requirements and deliver solutions. Skills Must have Strong proficiency in ReactJS and modern front-end development practices. Hands-on experience in unit testing with JUnit and TestContainers, as well as UI testing using Jest and React Test. Experience with Azure Cloud services, particularly Azure Blob Storage and cloud computing best practices. Strong caching experience with Redis. Knowledge of CI/CD pipelines and automation using GitLab. Good understanding of distributed systems and message-oriented middleware. Nice to have Familiarity with Ververica Platforms for stream processing. Experience in Kubernetes and Docker for containerized application deployment. Familiarity with other cloud platform Azure. Other Languages EnglishB2 Upper Intermediate Seniority Senior
Posted Date not available
5.0 - 8.0 years
10 - 14 Lacs
chennai
Work from Office
Mandatory Skills Terraform modules Devops, AWS and Azure Years of exp needed Minimum of 8 Years Work Location pls mention city and preferably office address as well Chennai Rates including mark up - 170 K/M Cloud Platform Engineer Chennai Band B3 No. of position - 3 Position Overview: Cloud Platform Engineer will be responsible for developing and maintaining Terraform modules and patterns for AWS and Azure. These modules and patterns will be used for platform landing zones, application landing zones, and application infrastructure deployments. The role involves managing the lifecycle of these patterns, including releases, bug fixes, feature integrations, and updates to test cases. Key Responsibilities: Develop and release Terraform modules, landing zones, and patterns for AWS and Azure. Provide lifecycle support for patterns, including bug fixing and maintenance. Integrate new features into existing patterns to enhance functionality. Release updated and new patterns to ensure they meet current requirements. Update and maintain test cases for patterns to ensure reliability and performance. Qualifications: 5+ years of AWS/Azure cloud migration experience. Proficiency in Cloud compute (EC2, EKS, Azure VM, AKS) and Storage (s3, EBS,EFS, Azure Blob, Azure Managed Disks, Azure Files). Strong knowledge of AWS and Azure cloud services. Expert in terraform. AWS/Azure certification preferred. Team Management Resourcing Forecast talent requirements as per the current and future business needs Hire adequate and right resources for the team Train direct reportees to make right recruitment and selection decisions Talent Management Ensure 100% compliance to Wipros standards of adequate onboarding and training for team members to enhance capability & effectiveness Build an internal talent pool of HiPos and ensure their career progression within the organization Promote diversity in leadership positions Performance Management Set goals for direct reportees, conduct timely performance reviews and appraisals, and give constructive feedback to direct reports. Incase of performance issues, take necessary action with zero tolerance for will based performance issues Ensure that organizational programs like Performance Nxtarewell understood and that the team is taking the opportunities presented by such programs to their and their levels below Employee Satisfaction and Engagement Lead and drive engagement initiatives for the team Track team satisfaction scores and identify initiatives to build engagement within the team Proactively challenge the team with larger and enriching projects/ initiatives for the organization or team Exercise employee recognition and appreciation Deliver No. Performance Parameter Measure 1. Continuous Integration, Deployment & Monitoring 100% error free on boarding & implementation 2. CSAT Manage service tools Troubleshoot queries Customer experience 3. Capability Building & Team Management % trained on new age skills, Team attrition %, Employee satisfaction score Mandatory Skills: Cloud AWS Devops. Experience5-8 Years.
Posted Date not available
6.0 - 11.0 years
6 - 10 Lacs
bengaluru
Work from Office
We are seeking a highly skilled ‘Storage Software’ product Development Engineer with over 6 years of expertise in software product development, design, and support, with a strong preference for experience in the storage domain and in data protection. The ideal candidate will be member of seasoned software engineers team, with hands-on responsibility for design, implementation, support and optimization of storage solutions to ensure high performance, stability, and reliability of IBM storage software.Software Development and Maintenance: Contribute in design, development and implementation of C/C++ software components for storage/backup products, ensuring adherence to coding standards, best practices, and performance guidelines. Contribute tothe release of high-quality products on schedule. Mentor and guide team members to maintain coding excellence and productivity. Product Support: Provide technical expertise and support to customers and internal stakeholders regarding product inquiries and issues. Drive customer feedback and ideas into the product roadmap, ensuring timely and effective delivery. Manage incidents, tickets, problems and escalations from customers with innovative solutions. Collaborate with product management to align technical solutions with business goals. Troubleshooting and Debugging: Investigate and resolve complex software issues, Excellent debugging, core-dump analysis skills. Working closely with cross-functional teams to identify root causes and implement effective solutions. System Programming Skills: To debug system software, analyse for performance bottle neck and remediate to improve overall system performance. Storage/Backup Skills: In-depth knowledge of storage and backup technologies, such as tape libraries, disk-based backup, and cloud-based solutions. Utilize comprehensive knowledge of system-level programming to optimize storage solutions. Continuous Improvement: Stay updated with the latest advancements in database technologies, C/C++ development practices, and software design principles. Recommend and implement improvements to enhance product performance and maintainability Required education Bachelor's Degree Preferred education Bachelor's Degree Required technical and professional expertise Bachelor's or Master’s degree in Computer Science, Software Engineering, or a related discipline. Extensive over 6 years of experience primarily in C/C++ system software development, including product development and support. And, exposure to other programming languages, development / test environments, and related diagnostic tools. Proven technical leadership and team management experience. Excellent problem-solving and debugging skills to analyze and resolve complex technical issues. Strong expertise in systems and storage software development. Proven experience as a Data Backup Domain Developer or similar role. Proven experience in optimizing multithreaded environment. Development experience with cloud object storage providers such asAWS S3, Azure Blob Store, GoogleCloud Storage Experience with storage systems, backup environments are highly desirable. Solid understanding of software design principles and best practices. Strong communication and collaboration skills to work effectively in a team-oriented environment.
Posted Date not available
10.0 - 12.0 years
6 - 9 Lacs
chennai, bengaluru
Work from Office
Location: Bangalore, Chennai Extract data from source system using Data Factory pipelines Massaging and Cleansing the data Transform data based on business rules Expose the data for reporting needs and exchange data with downstream applications. Standardize the various integration flows (e.g decom ALDML Init integration, simplify ALDML Delta integration).
Posted Date not available
10.0 - 15.0 years
11 - 16 Lacs
gurugram, bengaluru
Work from Office
Shift: 24/7 (Monthly Rotation) Remote Work From Home The Rubrik Backup Engineer IV is a senior technical specialist responsible for the design, implementation, troubleshooting, and optimisation of enterprise backup and disaster recovery solutions using Rubrik. This individual acts as a technical escalation point and subject matter expert (SME), driving innovation and automation while collaborating with cross-functional teams to ensure data protection strategies meet business requirements. The role demands deep expertise in Rubrik architecture and integration, alongside strong capabilities in automation, scripting, VMware, and modern data protection frameworks. Career Level Summary Recognized expert with specialized depth in enterprise backup and recovery. Leads large-scale initiatives and provides technical direction across teams. Works independently on the most complex issues and initiatives. Coaches and mentors junior engineers and cross-functional peers. Key Responsibilities Serve as the highest level of technical escalation for Rubrik-related incidents and issues. Architect and implement Rubrik backup solutions across hybrid, on-premises, and multi-cloud environments (AWS, Azure, GCP). Lead backup and recovery strategy design sessions for customers, including air-gapped, immutable, and ransomware-resilient architectures. Integrate Rubrik with external systems (e.g., ServiceNow, Splunk, vSphere, Azure AD) using REST APIs and automation tools (Python, Ansible, Terraform). Design and maintain Rubrik SLA Domains, archival policies (cloud/tape), replication, and compliance workflows. Collaborate with Engineering, Storage, Security, and Application teams to ensure backup consistency and performance. Manage large-scale Rubrik clusters, capacity planning, and software upgrades. Proactively identify and resolve systemic issues across infrastructure that impact backup performance or restore SLAs. Document architectures, runbooks, and SOPs; contribute to technical training and playbooks. Work closely with stakeholders and leadership on backup audit and regulatory compliance requirements. Provide technical mentoring and guidance to junior engineers and partner teams. Required Skills and Knowledge Expert knowledge of Rubrik CDM architecture, RBS, Polaris, and Rubrik APIs. Advanced skills in backup for virtualized environments (VMware, Hyper-V). Strong understanding of file-level, database-level, and VM-level backup and restore operations. Deep knowledge of cloud-native backups and cloud archiving using AWS S3, Azure Blob, and GCP storage. Hands-on experience with integration and automation (e.g., Python, PowerShell, REST API, Terraform, Ansible). Proficiency in disaster recovery design, planning, and orchestration (DR runbooks). Familiarity with data compliance, encryption, and ransomware defense mechanisms. Exposure to monitoring and reporting platforms (e.g., vROps, Splunk, Grafana). Good knowledge of enterprise infrastructure: storage systems (SAN/NAS), networking, Windows/Linux OS. Excellent documentation and communication skills for both technical and executive-level audiences. Experience / Education Minimum of 10 years of experience in IT infrastructure with at least 4 years of hands-on experience in Rubrik. Proven track record of managing complex enterprise-scale backup environments. Experience with backup and recovery for databases (MSSQL, Oracle), file servers, and virtual machines. Bachelor's degree in Computer Science, Information Technology, or equivalent work experience. Preferred Certifications: Rubrik Certified System Administrator (RCSA) or similar Rubrik platform certifications. VMware VCP, AWS/Azure certifications are a plus.
Posted Date not available
8.0 - 13.0 years
10 - 14 Lacs
hyderabad
Work from Office
#Employment Type: Contract Skills Azure Data Factory SQL Azure Blob Azure Logic Apps
Posted Date not available
6.0 - 7.0 years
8 - 9 Lacs
pune
Work from Office
As an Associate Software Developer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets. In this role, your responsibilities may include: Implementing and validating predictive models as well as creating and maintain statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques Designing and implementing various enterprise seach applications such as Elasticsearch and Splunk for client requirements Work in an Agile, collaborative environment, partnering with other scientists, engineers, consultants and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviors. Build teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modeling results Required education Bachelor's Degree Preferred education Master's Degree Required technical and professional expertise Total Exp-6-7 Yrs (Relevant-4-5 Yrs) Mandatory Skills: Azure Databricks, Python/PySpark, SQL, Github, - Azure Devops - Azure Blob Ability to use programming languages like Java, Python, Scala, etc., to build pipelines to extract and transform data from a repository to a data consumer Ability to use Extract, Transform, and Load (ETL) tools and/or data integration, or federation tools to prepare and transform data as needed. Ability to use leading edge tools such as Linux, SQL, Python, Spark, Hadoop and Java Preferred technical and professional experience You thrive on teamwork and have excellent verbal and written communication skills. Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions Ability to communicate results to technical and non-technical audiences.
Posted Date not available
8.0 - 13.0 years
11 - 15 Lacs
pune
Work from Office
Provide expert-level support for storage systems including SAN, NAS, DAS, object storage, and backup technologies. Design and implement storage solutions based on enterprise requirements (performance, availability, scalability). Perform root cause analysis on complex storage and backup issues, providing permanent resolutions. Lead storage migrations, upgrades, and integration projects. Establish and enforce storage policies, standards, and best practices. Drive automation initiatives for storage provisioning, monitoring, and reporting. Collaborate with infrastructure, application, and cloud teams for seamless storage integration. Maintain documentation for storage architecture, processes, and configurations. Provide capacity planning and performance analysis to ensure optimal performance and cost efficiency. Manage storage vendors and support cases as needed. Technical Skills & Tools: Storage Platforms: NetApp, Dell EMC (VNX, Unity, PowerMax), IBM, HPE 3PAR/Nimble, Pure Storage. SAN Technologies: Brocade, Cisco MDS. Backup Solutions: Commvault, Veeam, NetBackup, Data Domain, Avamar. Protocols: FC, iSCSI, NFS, SMB, CIFS. Scripting & Automation: PowerShell, Ansible, Shell scripting. Monitoring Tools: SolarWinds, Nagios, SNMP-based tools. Cloud Storage: AWS S3/EBS/Glacier, Azure Blob/File, Google Cloud Storage (preferred). Strong understanding of storage tiering, deduplication, compression, snapshots, and replication
Posted Date not available
Upload Resume
Drag or click to upload
Your data is secure with us, protected by advanced encryption.
Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.
We have sent an OTP to your contact. Please enter it below to verify.
Accenture
54024 Jobs | Dublin
Wipro
24262 Jobs | Bengaluru
Accenture in India
18733 Jobs | Dublin 2
EY
17079 Jobs | London
Uplers
12548 Jobs | Ahmedabad
IBM
11704 Jobs | Armonk
Amazon
11059 Jobs | Seattle,WA
Bajaj Finserv
10656 Jobs |
Accenture services Pvt Ltd
10587 Jobs |
Oracle
10506 Jobs | Redwood City