
1401 Databricks Jobs - Page 48

JobPe aggregates listings for easy browsing, but applications are submitted directly on the original job portal.

12.0 - 15.0 years

8 - 12 Lacs

Bengaluru

Work from Office

Job Title: ARCHITECT - AWS Databricks, SQL | Experience: 12-15 Years | Location: Bangalore
Skills: ARCHITECT, AWS, Databricks, SQL

Posted 1 month ago

Apply

12.0 - 20.0 years

3 - 7 Lacs

Bengaluru

Work from Office

Job Title: Senior Software Engineer | Experience: 12-20 Years | Location: Bangalore

Requirements:
- Strong knowledge and hands-on experience in AWS Databricks; experience in the HP ecosystem (FDL architecture) is nice to have
- Technically strong enough to help the team with any technical issues faced during execution; owns the end-to-end technical deliverables
- Hands-on Databricks and SQL knowledge
- Experience with AWS S3, Redshift, EC2, and Lambda services
- Extensive experience in developing and deploying big data pipelines
- Experience in Azure Data Lake
- Strong hands-on SQL development (including Azure SQL) and in-depth understanding of SQL optimization and tuning techniques with Redshift
- Development in notebooks (Jupyter, Databricks, Zeppelin, etc.)
- Development experience in Spark
- Experience in a scripting language such as Python, plus any other programming language

Roles and Responsibilities:
- Candidate must have hands-on experience in AWS Databricks
- Good development experience using Python/Scala, Spark SQL, and DataFrames (a minimal sketch follows this posting)
- Hands-on Databricks, Data Lake, and SQL knowledge is a must
- Performance tuning, troubleshooting, and debugging Spark

Process Skills: Agile (Scrum)
Qualification: Bachelor of Engineering (computer background preferred)
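
For orientation, here is a minimal, hypothetical sketch of the kind of Spark SQL + DataFrame work this role describes. The bucket paths, table layout, and column names are illustrative assumptions, not from the posting.

```python
# Hypothetical PySpark sketch: mixing the DataFrame API and Spark SQL, with a
# broadcast join as a simple performance-tuning touch. All paths/schemas assumed.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-daily-agg").getOrCreate()

orders = spark.read.parquet("s3://example-bucket/raw/orders/")        # assumed path
customers = spark.read.parquet("s3://example-bucket/raw/customers/")  # assumed path

# Broadcast the small dimension table to avoid a shuffle-heavy join.
enriched = orders.join(F.broadcast(customers), "customer_id")

# Register a view so the aggregation can be expressed in Spark SQL.
enriched.createOrReplaceTempView("enriched_orders")
daily = spark.sql("""
    SELECT order_date, region, SUM(amount) AS revenue, COUNT(*) AS order_count
    FROM enriched_orders
    GROUP BY order_date, region
""")

# Coalesce keeps small-file counts down on the curated layer.
daily.coalesce(8).write.mode("overwrite").partitionBy("order_date") \
    .parquet("s3://example-bucket/curated/daily_revenue/")
```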

Posted 1 month ago

Apply

10.0 - 12.0 years

9 - 13 Lacs

Chennai

Work from Office

Job Title: Data Architect | Experience: 10-12 Years | Location: Chennai

- 10-12 years of experience as a Data Architect
- Strong expertise in streaming data technologies like Apache Kafka, Flink, Spark Streaming, or Kinesis (a minimal streaming sketch follows this posting)
- Proficiency in programming languages such as Python, Java, Scala, or Go
- Experience with big data tools like Hadoop and Hive, and data warehouses such as Snowflake, Redshift, Databricks, and Microsoft Fabric
- Proficiency in database technologies (SQL, NoSQL, PostgreSQL, MongoDB, DynamoDB, YugabyteDB)
- Should be flexible to work as an individual contributor
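
As a rough illustration of the streaming stack this posting lists, here is a hedged sketch of a Kafka-to-Spark Structured Streaming pipeline. The broker address, topic name, event schema, and checkpoint path are assumptions, and the Spark-Kafka connector is presumed available on the cluster.

```python
# Hypothetical sketch: consume JSON events from Kafka and aggregate per minute.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

spark = SparkSession.builder.appName("kafka-events").getOrCreate()

event_schema = StructType([
    StructField("event_id", StringType()),
    StructField("amount", DoubleType()),
    StructField("ts", TimestampType()),
])

raw = (spark.readStream.format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")  # assumed broker
       .option("subscribe", "payments")                   # assumed topic
       .load())

# Kafka delivers bytes; decode the JSON payload into typed columns.
events = (raw.select(F.from_json(F.col("value").cast("string"), event_schema).alias("e"))
             .select("e.*"))

# Watermark bounds state so late events do not grow memory without limit.
per_minute = (events.withWatermark("ts", "10 minutes")
              .groupBy(F.window("ts", "1 minute"))
              .agg(F.sum("amount").alias("total")))

query = (per_minute.writeStream.outputMode("update").format("console")
         .option("checkpointLocation", "/tmp/chk/payments")  # assumed path
         .start())
query.awaitTermination()
```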

Posted 1 month ago

Apply

5.0 - 10.0 years

12 - 22 Lacs

Hyderabad, Bengaluru

Work from Office

Experience Range: 4 - 12+ Years | Work Location: Bangalore (Preferred)

Must-Have Skills: Airflow, BigQuery, Hadoop, PySpark, Spark/Scala, Python, Spark SQL, Snowflake, ETL, Data Modelling, Erwin or ER Studio, Stored Procedures & Functions, AWS, Azure Databricks, Azure Data Factory

No. of Openings: 10+

Job Description: We have multiple roles with our clients. The JDs for each role follow.

Role 1: Data Engineer
- 5+ years of experience in data engineering or a related role
- Proficiency in Apache Airflow for workflow scheduling and management (see the DAG sketch after this posting)
- Strong experience with Hadoop ecosystems, including HDFS, MapReduce, and Hive
- Expertise in Apache Spark/Scala for large-scale data processing
- Proficient in Python; advanced SQL skills for data analysis and reporting
- Experience with cloud platforms (e.g., AWS, Google Cloud, Azure) is a plus
- Designs, proposes, builds, and maintains databases and data lakes, data pipelines that transform and model data, and reporting and analytics solutions
- Understands business problems and processes from direct conversations with customers, sees the big picture, and translates it into specific solutions
- Identifies issues early, tactfully raises concerns, and proposes solutions
- Participates in peer code reviews and clearly articulates the pros and cons of various tools and approaches
- Documents and diagrams proposed solutions

Role 2: Support Data Engineer
- Prioritize and resolve Business-As-Usual (BAU) support queries within agreed Service Level Agreements (SLAs) while ensuring application stability
- Drive engineering delivery to reduce technical debt across the production environment, collaborating with development and infrastructure teams
- Perform technical analysis of the production platform to identify and address performance and resiliency issues
- Participate in the Software Development Lifecycle (SDLC) to improve production standards and controls
- Build and maintain the support knowledge database, updating the application runbook with known tasks and managing event monitoring
- Create health-check monitors, dashboards, synthetic transactions, and alerts to increase monitoring and observability of systems at scale
- Participate in the on-call rotation supporting application release validation, alert response, and incident management
- Collaborate with development, product, and customer success teams to identify and resolve technical problems
- Research and implement recommendations from post-mortem analyses for continuous improvement
- Document issue details and solutions in our ticketing systems (JIRA and ServiceNow); assist in creating and maintaining technical documentation, runbooks, and knowledge base articles
- Navigate a complex system, requiring deep troubleshooting/debugging skills and the ability to manage multiple contexts efficiently
- Oversee the collection, storage, and maintenance of production data, ensuring its accuracy and availability for analysis
- Monitor data pipelines and production systems to ensure smooth operation and quickly address any issues that arise
- Implement and maintain data quality standards, conducting regular checks to ensure data integrity
- Identify and resolve technical issues related to data processing and production systems
- Work closely with data engineers, analysts, and other stakeholders to optimize data workflows and improve production efficiency
- Contribute to continuous improvement initiatives by analyzing data to identify areas for process optimization

Role 3: ETL Support Engineer
- 6+ years of experience with ETL support and development
- Experience with popular ETL tools such as Talend and Microsoft SSIS
- Experience with relational databases (e.g., SQL Server, Postgres) and the Snowflake data warehouse
- Proficiency in writing complex SQL queries for data validation, comparison, and manipulation
- Familiarity with version control systems such as Git/GitHub to manage changes in test cases and scripts
- Knowledge of defect tracking tools such as JIRA and ServiceNow
- Banking domain experience is a must
- Understanding of the ETL process; perform functional, integration, and regression testing for ETL processes
- Validate and ensure data quality and consistency across different data sources and targets
- Develop and execute test cases for ETL workflows and data pipelines
- Load testing: ensure the data warehouse can handle the volume of data being loaded and queried under normal and peak conditions
- Scalability: test the scalability of the data warehouse in terms of data growth and system performance

Role 4: Senior Data Modeler
- 7+ years of experience in metadata management, data modelling, and related tools (Erwin, ER Studio, or others); 10+ years of overall IT experience
- Hands-on relational, dimensional, and/or analytic experience (using RDBMS, dimensional data platform technologies, and ETL and data ingestion)
- Experience with data warehouses, data lakes, and enterprise big data platforms in multi-data-center contexts required
- Strong communication and presentation skills
- Help the team implement business and IT data requirements through new data strategies and designs across all data platforms (relational, dimensional) and data tools (reporting, visualization, analytics, and machine learning)
- Work with business and application/solution teams to implement data strategies and develop conceptual/logical/physical data models
- Define and govern data modelling and design standards, tools, best practices, and related development for enterprise data models
- Hands-on modelling and mapping between source system data models and data warehouse data models
- Work proactively and independently to address project requirements and articulate issues/challenges to reduce project delivery risks with respect to modelling and mappings
- Hands-on experience writing complex SQL queries; experience in data modelling for NoSQL objects is good to have

Role 5: Data Engineer - Databricks
- Design and build data pipelines using Spark SQL and PySpark in Azure Databricks; design and build ETL pipelines using ADF
- Build and maintain a Lakehouse architecture in ADLS/Databricks
- Perform data preparation tasks including data cleaning, normalization, deduplication, and type conversion
- Work with the DevOps team to deploy solutions in production environments
- Control data processes and take corrective action when errors are identified; corrective action may include executing a workaround process and then identifying the cause and solution for the data errors
- Participate as a full member of the global Analytics team, providing solutions for and insights into data-related items
- Collaborate with Data Science and Business Intelligence colleagues across the world to share key learnings, leverage ideas and solutions, and propagate best practices
- Lead projects that include other team members and participate in projects led by other team members
- Apply change management tools, including training, communication, and documentation, to manage upgrades, changes, and data migrations
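
The Airflow sketch referenced in Role 1: a hedged, minimal DAG assuming Airflow 2.x. The DAG id, schedule, and task callables are illustrative placeholders, not anything specified in the posting.

```python
# Hypothetical sketch of an Airflow 2.x ETL DAG with a linear dependency chain.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull data from source")       # placeholder for a real extract step

def transform():
    print("clean and model the data")    # placeholder for a real transform step

def load():
    print("write to the warehouse")      # placeholder for a real load step

with DAG(
    dag_id="example_etl",                # assumed name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                   # Airflow 2.4+ 'schedule' argument
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t3 = PythonOperator(task_id="load", python_callable=load)
    t1 >> t2 >> t3                       # extract -> transform -> load
```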

Posted 1 month ago

Apply

12.0 - 16.0 years

18 - 25 Lacs

Hyderabad

Remote

JD for Full-Stack Developer | Experience: 10+ years

Front-End Development:
- Design and implement intuitive, responsive user interfaces using React.js or similar front-end technologies; collaborate with stakeholders to create a seamless user experience
- Create mockups and UI prototypes for quick turnaround using Figma, Canva, or similar tools
- Strong proficiency in HTML, CSS, JavaScript, and React.js
- Experience with styling and graph libraries such as Highcharts, Material UI, and Tailwind CSS
- Solid understanding of React fundamentals, including routing, the virtual DOM, and higher-order components (HOCs)
- Knowledge of REST API integration; understanding of Node.js is a big advantage

Middleware Development:
- Experience with REST API development, preferably using FastAPI (a minimal sketch follows this posting)
- Proficiency in programming languages like Python
- Integrate APIs and services between front-end and back-end systems
- Experience with Docker and containerized applications

Back-End Development:
- Experience with orchestration tools such as Apache Airflow or similar
- Design, develop, and manage simple data pipelines using Databricks, PySpark, and Google BigQuery
- Medium-level expertise in SQL; basic understanding of authentication methods such as JWT and OAuth

Bonus Skills:
- Experience working with cloud platforms such as AWS, GCP, or Azure
- Familiarity with Google BigQuery and Google APIs
- Hands-on experience with Kubernetes for container orchestration

Contact: Sandeep Nunna | Phone: 9493883212 | Email: sandeep.nunna@clonutsolutions.com
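
A minimal, hypothetical sketch of the FastAPI-style middleware this JD mentions. The route names and payload model are assumptions for illustration only.

```python
# Hypothetical FastAPI middleware service: one health probe, one typed endpoint.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="middleware-api")

class Order(BaseModel):
    order_id: str
    amount: float

@app.get("/health")
def health() -> dict:
    # Lightweight liveness probe, useful under container orchestration.
    return {"status": "ok"}

@app.post("/orders")
def create_order(order: Order) -> dict:
    # In a real service this would hand off to the back-end/pipeline layer.
    return {"received": order.order_id, "amount": order.amount}

# Run locally with: uvicorn main:app --reload   (assumes this file is main.py)
```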

Posted 1 month ago

Apply

3.0 - 13.0 years

3 - 6 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site

We are looking for a skilled .NET Core developer to join our team. The ideal candidate should have strong expertise in .NET Core, Azure, and Investran, along with experience building scalable, high-performance applications.

Key Responsibilities:
- Develop, maintain, and optimize applications using .NET Core and related technologies
- Work with Azure Cloud and Azure Databricks to build cloud-based solutions
- Implement and manage solutions using Investran
- Develop applications using VB.NET, Entity Framework, LINQ, and .NET MVC
- Apply design patterns to create efficient and maintainable code
- Collaborate with cross-functional teams to design, develop, and deploy software solutions
- Troubleshoot and debug applications to ensure optimal performance

Required Skills & Qualifications:
- Proficiency in .NET Core and .NET MVC
- Experience with Azure Cloud services and Azure Databricks
- Hands-on experience with Investran
- Strong knowledge of VB.NET, Entity Framework, and LINQ
- Understanding of design patterns and best coding practices
- Experience in developing and optimizing high-performance applications
- Excellent problem-solving and communication skills

Preferred Qualifications:
- Experience with cloud-based deployments and containerization
- Familiarity with Agile development methodologies

If you are passionate about building innovative software solutions and want to be part of a dynamic team, we'd love to hear from you!

Posted 1 month ago

Apply

12.0 - 17.0 years

25 - 35 Lacs

Pune, Chennai

Hybrid

- Should have led at least 3 large legacy EDW/data platform modernization and migration engagements (to Snowflake, Databricks, or data on cloud) in the last 5+ years.
- Experience leading all aspects of the project/program life cycle, including

Posted 1 month ago

Apply

8.0 - 11.0 years

35 - 37 Lacs

Kolkata, Ahmedabad, Bengaluru

Work from Office

Dear Candidate,

We are hiring a Cloud Architect to design and oversee scalable, secure, and cost-efficient cloud solutions. Great for architects who bridge technical vision with business needs.

Key Responsibilities:
- Design cloud-native solutions using AWS, Azure, or GCP
- Lead cloud migration and transformation projects
- Define cloud governance, cost control, and security strategies
- Collaborate with DevOps and engineering teams on implementation

Required Skills & Qualifications:
- Deep expertise in cloud architecture and multi-cloud environments
- Experience with containers, serverless, and microservices
- Proficiency in Terraform, CloudFormation, or equivalent
- Bonus: a cloud certification (AWS/Azure/GCP Architect)

Soft Skills:
- Strong troubleshooting and problem-solving skills
- Ability to work independently and in a team
- Excellent communication and documentation skills

Note: If interested, please share your updated resume and a preferred time for a discussion. If shortlisted, our HR team will contact you.

Kandi Srinivasa
Delivery Manager, Integra Technologies

Posted 1 month ago

Apply

3.0 - 6.0 years

12 - 22 Lacs

Gurugram

Work from Office

Overview
170+ Years Strong. Industry Leader. Global Impact. At Pinkerton, the mission is to protect our clients. To do this, we provide enterprise risk management services and programs specifically designed for each client. Pinkerton employees are one of our most important assets and critical to the delivery of world-class solutions. Bonded together, we share a commitment to integrity, vigilance, and excellence. Pinkerton is an inclusive employer that seeks candidates with diverse backgrounds, experiences, and perspectives to join our family of industry subject matter experts.

The Data Engineer will be part of a high-performing international team whose goal is to expand Data & Analytics solutions for our CRM application, which is live in all Securitas countries. Together with the dedicated Frontend & BI Developer, you will be responsible for managing and maintaining the Databricks-based BI platform; data model changes and the implementation and development of pipelines are part of the daily focus, but ETL will get most of your attention. Continuous improvement will require the ability to think bigger and work closely with the whole team. The Data Engineer (ETL Specialist) will collaborate with the Frontend & BI Developer on ways to improve the BI platform deliverables, specifically for the CEP organization. Cooperation with other departments, such as integrations, specific IT/IS projects, and business specialists, is part of the job. The expectation is to always take data privacy into consideration when moving or sharing data; for that purpose, a security layer needs to be developed as agreed with the legal department.

Responsibilities:
- Represent Pinkerton's core values of integrity, vigilance, and excellence.
- Maintain and develop the Databricks workspace used to host the BI CEP solution.
- Actively advise on the data model changes needed to accommodate new BI requirements.
- Develop and implement new ETL scripts and improve current ones (a minimal sketch follows this posting).
- Take ownership of resolving incoming tickets for both incidents and requests.
- Plan activities to stay close to the Frontend & BI Developer and foresee upcoming changes to the backend.
- Improve teamwork across team members, using the DevOps tool to track the status of each deliverable from start to end.
- Ensure understanding and visible implementation of the company's core values of integrity, vigilance, and helpfulness.
- Know the skills and experience available and required in your area today and tomorrow, liaising with other departments as needed.
- All other duties, as assigned.

Qualifications:
- At least 3+ years of experience in data engineering.
- Understanding of designing and implementing data processing architectures in Azure environments.
- Experience with different SSAS modelling techniques (preferably Azure/Databricks/Microsoft-related).
- Understanding of data management and treatment to secure data governance and security (platform management and administration).
- An analytical mindset with clear communication and problem-solving skills.
- Experience working in a Scrum setup.
- Fluent in English, both spoken and written; knowledge of additional languages is a bonus.
- Ability to communicate, present, and influence credibly at all levels, both internally and externally.
- Business acumen and commercial awareness.

Working Conditions: With or without reasonable accommodation, requires the physical and mental capacity to effectively perform all essential functions; regular computer usage; occasional reaching and lifting of small objects and operating office equipment; frequent sitting, standing, and/or walking; travel, as required.

Pinkerton is an equal opportunity employer to all applicants and positions without regard to race/ethnicity, color, national origin, ancestry, sex/gender, gender identity/expression, sexual orientation, marital/prenatal status, pregnancy/childbirth or related conditions, religion, creed, age, disability, genetic information, veteran status, or any protected status by local, state, federal or country-specific law.
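
The ETL sketch referenced above: a hedged, minimal Databricks-style incremental load with a simple privacy-masking step, in the spirit of the posting's data-privacy note. The table names, columns, and watermark scheme are assumptions.

```python
# Hypothetical PySpark ETL step: pseudonymize identifiers, then append only new rows.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("crm-etl").getOrCreate()

crm = spark.read.table("raw.crm_contacts")          # assumed source table

# Pseudonymize direct identifiers before the data leaves the restricted zone.
masked = (crm
          .withColumn("email_hash", F.sha2(F.col("email"), 256))
          .drop("email", "phone"))

# Incremental-style load: keep only rows newer than the last recorded watermark.
last_load = spark.read.table("meta.load_watermark") \
                 .agg(F.max("loaded_at")).first()[0]  # assumed metadata table
delta = masked.filter(F.col("updated_at") > F.lit(last_load))

delta.write.mode("append").saveAsTable("bi.crm_contacts_masked")
```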

Posted 2 months ago

Apply

5.0 - 9.0 years

25 - 40 Lacs

Pune

Work from Office

Position Summary: As a member of Redaptive's AI team, you will drive Agentic AI and Generative AI integration across all of Redaptive's business units. You will drive AI development and integration across the organization, directly impacting Redaptive's global sustainability efforts and shaping how we leverage AI to serve Fortune 500 clients.

Responsibilities and Duties:

Strategic Leadership (10%):
- Champion the AI/ML roadmap, driving strategic planning and execution for all initiatives.
- Provide guidance on data science projects (Agentic AI, Generative AI, and Machine Learning), aligning them with business objectives and best practices.
- Foster a data-driven culture, advocating for AI-powered solutions to business challenges and efficiency improvements.
- Collaborate with product management, engineering, and business stakeholders to identify opportunities and deliver impactful solutions.

Technical Leadership (40%):
- Architect and develop proof-of-concept (POC) solutions for Agentic AI, Generative AI, and ML, utilizing Python and relevant data science libraries and leveraging MLflow (a minimal tracking sketch follows this posting).
- Provide technical guidance on AI projects, ensuring alignment with business objectives and best practices.
- Assist in the development and documentation of standards for ethical and regulatory-compliant AI usage.
- Stay current with AI advancements, contributing to the team's knowledge and expertise.
- Perform hands-on data wrangling and AI model development.

Operational Leadership (50%):
- Drive continuous improvement through Agentic AI, Generative AI, and predictive modeling.
- Participate in Agile development processes (Scrum and Kanban).
- Ensure compliance with regulatory and ethical AI standards.
- Other duties as assigned.

Required Abilities and Skills (strong hands-on experience with):
- Agentic AI development and deployment.
- Statistical modeling, machine learning algorithms, and data mining techniques.
- Databricks and MLflow for model training, deployment, and management on AWS.
- Working with large datasets on AWS and Databricks.

Desired Experience:
- Integrating AI with IoT/event data.
- Real-time and batch inference integration with SaaS applications.
- International team management experience.
- A track record of successful product launches in regulated environments.

Education and Experience:
- 5+ years of data science/AI experience.
- Bachelor's degree in Statistics, Data Science, Computer Engineering, Mathematics, or a related field (Master's preferred).
- Proven track record of deploying successful Agentic AI, Generative AI, and ML projects from concept to production.
- Excellent communication skills, able to explain complex technical concepts to both technical and non-technical audiences.
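
The MLflow sketch referenced above: a hedged, minimal tracking run. The toy dataset and hyperparameters are made up for illustration and have nothing to do with Redaptive's actual models.

```python
# Hypothetical sketch: track a model's params, metrics, and artifact with MLflow.
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=42)

with mlflow.start_run(run_name="rf-baseline"):
    model = RandomForestClassifier(n_estimators=200, random_state=42).fit(X_tr, y_tr)
    acc = accuracy_score(y_te, model.predict(X_te))
    mlflow.log_param("n_estimators", 200)     # hyperparameters alongside the run
    mlflow.log_metric("accuracy", acc)        # metrics for later comparison
    mlflow.sklearn.log_model(model, "model")  # model artifact for registry/deployment
```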

Posted 2 months ago

Apply

8.0 - 10.0 years

10 - 14 Lacs

Pune

Work from Office

Salary: 20 - 28 LPA

About The Role - Mandatory Skills: AWS Architect, AWS Glue or Databricks, PySpark, and Python

- Hands-on experience with AWS Glue or Databricks, PySpark, and Python.
- Minimum of 2 years of hands-on expertise in PySpark, including Spark job performance optimization techniques.
- Minimum of 2 years of hands-on involvement with AWS Cloud.
- Hands-on experience with Step Functions, Lambda, S3, Secrets Manager, Snowflake/Redshift, RDS, and CloudWatch (a Lambda/Step Functions sketch follows this posting).
- Proficiency in crafting low-level designs for data warehousing solutions on AWS cloud.
- Proven track record of implementing big-data solutions within the AWS ecosystem, including Data Lakes.
- Familiarity with data warehousing, data quality assurance, and monitoring practices.
- Demonstrated capability in constructing scalable data pipelines and ETL processes.
- Proficiency in testing methodologies and validating data pipelines.
- Experience with, or working knowledge of, DevOps environments.
- Practical experience in data security services.
- Understanding of data modeling, integration, and design principles.
- Strong communication and analytical skills.
- A dedicated team player with a goal-oriented mindset, committed to delivering quality work with attention to detail.
- Solution Design: Collaborate with clients and stakeholders to understand business requirements and translate them into cloud-based solutions utilizing AWS services (EC2, Lambda, S3, RDS, VPC, IAM, etc.).
- Architecture and Implementation: Design and implement secure, scalable, and high-performance cloud solutions, ensuring alignment with AWS best practices and architectural principles.
- Cloud Migration: Assist with the migration of on-premise applications to AWS, ensuring minimal disruption and maximum efficiency.
- Technical Leadership: Provide technical leadership and guidance to development teams to ensure adherence to architecture standards and best practices.
- Optimization: Continuously evaluate and optimize AWS environments for cost, performance, and security.
- Security: Ensure the cloud architecture adheres to industry standards and security policies, using tools like AWS Identity and Access Management (IAM), AWS Key Management Service (KMS), and encryption protocols.
- Documentation & Reporting: Create clear technical documentation covering architectural decisions, solution designs, and cloud configurations.
- Stakeholder Collaboration: Work with cross-functional teams, including developers, DevOps, QA, and business teams, to align technical solutions with business goals.
- Continuous Learning: Stay updated with the latest AWS services, tools, and industry trends to ensure the implementation of cutting-edge solutions.
- Strong understanding of AWS cloud services and architecture.
- Hands-on experience with Infrastructure as Code (IaC) tools like AWS CloudFormation, Terraform, or AWS CDK.
- Knowledge of networking, security, and database services within AWS (e.g., VPC, IAM, RDS, and S3).
- Familiarity with containerization and orchestration using AWS services like ECS, EKS, or Fargate.
- Proficiency in scripting languages (e.g., Python, Shell, or Node.js).
- Familiarity with CI/CD tools and practices in AWS environments (e.g., CodePipeline, Jenkins, etc.).

Soft Skills:

Communication Skills:
- Clear and Concise Communication: Ability to articulate complex technical concepts in simple terms for both technical and non-technical stakeholders.
- Active Listening: Ability to listen to business and technical requirements from stakeholders to ensure the proposed solution meets their needs.
- Documentation Skills: Ability to document technical designs, solutions, and architectural decisions in a clear, well-organized manner.

Leadership and Team Collaboration:
- Mentoring and Coaching: Ability to mentor junior engineers, providing guidance and fostering professional growth.
- Cross-functional Teamwork: Collaborate effectively with teams such as developers, DevOps, QA, business analysts, and security specialists to deliver integrated cloud solutions.
- Conflict Resolution: Address and resolve conflicts within teams and among stakeholders to ensure smooth project execution.

Problem-Solving and Critical Thinking:
- Analytical Thinking: Ability to break down complex problems and develop logical, scalable, and cost-effective solutions.
- Creative Thinking: Think outside the box to design innovative solutions that maximize the value of AWS technologies.
- Troubleshooting Skills: Quickly identify root causes of issues and find solutions to mitigate them.

Adaptability and Flexibility:
- Handling Change: Ability to adapt to evolving requirements, technologies, and business needs; cloud technologies and customer requirements change quickly.
- Resilience: Ability to deal with challenges and setbacks while maintaining a positive attitude and a focus on delivering results.

Stakeholder Management:
- Client-facing Skills: Ability to manage client relationships, understand their business needs, and translate those needs into cloud solutions.
- Negotiation Skills: Negotiate technical aspects of projects with clients or business units to balance scope, resources, and timelines.
- Expectation Management: Ability to set and manage expectations regarding timelines, deliverables, and technical feasibility.

Decision-Making:
- Sound Judgment: Make well-informed, balanced decisions that consider both technical feasibility and business impact.
- Risk Management: Ability to assess risks in terms of cost, security, and performance and make decisions that minimize potential issues.

Preferred Skills:
- Familiarity with DevOps practices and tools (e.g., Jenkins, Docker, Kubernetes).
- Experience with serverless architectures using AWS Lambda, API Gateway, and DynamoDB.
- Exposure to multi-cloud architectures (AWS, Azure, Google Cloud).

Why Join Us:
- Competitive salary and benefits.
- Opportunity to work on cutting-edge cloud technologies.
- A dynamic work environment where innovation is encouraged.
- Strong focus on professional development and career growth.
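
The Lambda/Step Functions sketch referenced above: a hedged illustration of a common pattern in the stack this JD lists, where a file landing in S3 triggers an ETL state machine. The state-machine ARN, environment variable, and event shape are assumptions.

```python
# Hypothetical AWS Lambda handler: start a Step Functions execution per new S3 object.
import json
import os
import boto3

sfn = boto3.client("stepfunctions")

def handler(event, context):
    # S3 put-event notification: extract the bucket/key of the new object.
    record = event["Records"][0]["s3"]
    payload = {
        "bucket": record["bucket"]["name"],
        "key": record["object"]["key"],
    }
    # Start the (assumed) ETL state machine with the object location as input.
    resp = sfn.start_execution(
        stateMachineArn=os.environ["STATE_MACHINE_ARN"],  # assumed env var
        input=json.dumps(payload),
    )
    return {"executionArn": resp["executionArn"]}
```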

Posted 2 months ago

Apply

3.0 - 8.0 years

4 - 8 Lacs

Hyderabad

Work from Office

Azure Data Factory:
- Develop Azure Data Factory objects: ADF pipelines, configuration, parameters, variables, integration runtimes
- Hands-on knowledge of ADF activities (such as Copy, Stored Procedure, and Lookup) and Data Flows
- ADF data ingestion and integration with other services

Azure Databricks:
- Experience with big data components such as Kafka, Spark SQL, DataFrames, and Hive implemented using Azure Databricks would be preferred
- Azure Databricks integration with other services
- Read and write data in Azure Databricks (a minimal sketch follows this posting)
- Best practices in Azure Databricks

Synapse Analytics:
- Import data into Azure Synapse Analytics with and without using PolyBase
- Implement a data warehouse with Azure Synapse Analytics
- Query data in Azure Synapse Analytics
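
The read/write sketch referenced above: a hedged, minimal Azure Databricks example over ADLS. The storage account, containers, and columns are assumptions, and authentication is presumed configured at the workspace level.

```python
# Hypothetical sketch: read CSV from ADLS, clean it, write it back as Delta.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # provided automatically on Databricks

src = "abfss://raw@examplestorage.dfs.core.windows.net/sales/"          # assumed
dst = "abfss://curated@examplestorage.dfs.core.windows.net/sales_clean/"  # assumed

df = spark.read.option("header", "true").csv(src)

clean = (df.dropDuplicates(["order_id"])
           .withColumn("amount", F.col("amount").cast("double"))
           .filter(F.col("amount") > 0))

# Delta is the default table format on Databricks; written here path-based.
clean.write.format("delta").mode("overwrite").save(dst)
```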

Posted 2 months ago

Apply

8.0 - 12.0 years

7 - 11 Lacs

Bengaluru

Work from Office

Job Position: Python Lead | Total Experience Required: 6+ years | Relevant Experience Required: around 5 years
Mandatory skills: strong Python coding and development
Good-to-have skills: cloud, SQL, data analysis
Location: Pune - Kharadi - WFO - 3 days/week

About The Role: We are seeking a highly skilled and experienced Python Lead to join our team. The ideal candidate will have strong expertise in Python coding and development, along with good-to-have skills in cloud technologies, SQL, and data analysis.

Key Responsibilities:
- Lead the development of high-quality, scalable, and robust Python applications.
- Collaborate with cross-functional teams to define, design, and ship new features.
- Ensure the performance, quality, and responsiveness of applications.
- Develop RESTful applications using frameworks like Flask, Django, or FastAPI.
- Utilize Databricks, PySpark SQL, and strong data analysis skills to drive data solutions.
- Implement and manage modern data solutions using Azure Data Factory, Data Lake, and Databricks.

Mandatory Skills:
- Proven experience with cloud platforms (e.g., AWS).
- Strong proficiency in Python, PySpark, and R, and familiarity with additional programming languages such as C++, Rust, or Java.
- Expertise in designing ETL architectures for batch and streaming processes, database technologies (OLTP/OLAP), and SQL.
- Experience with Apache Spark and multi-cloud platforms (AWS, GCP, Azure).
- Knowledge of data governance and GxP data contexts; familiarity with the pharma value chain is a plus.

Good-to-Have Skills:
- Experience with modern data solutions on Azure.
- Knowledge of the principles summarized in the Microsoft Cloud Adoption Framework.
- Additional expertise in SQL and data analysis.

Educational Qualifications: Bachelor's/Master's degree or equivalent with a focus on software engineering.

If you are a passionate Python developer with a knack for cloud technologies and data analysis, we would love to hear from you. Join us in driving innovation and building cutting-edge solutions!

Posted 2 months ago

Apply

7.0 - 10.0 years

1 - 5 Lacs

Pune

Work from Office

Responsibilities:
- Design, develop, and deploy data pipelines using Databricks, including data ingestion, transformation, and loading (ETL) processes.
- Develop and maintain high-quality, scalable, and maintainable Databricks notebooks using Python.
- Work with Delta Lake and other advanced features (a Delta MERGE sketch follows this posting).
- Leverage Unity Catalog for data governance, access control, and data discovery.
- Develop and optimize data pipelines for performance and cost-effectiveness.
- Integrate with various data sources, including but not limited to databases, cloud storage (Azure Blob Storage, ADLS, Synapse), and APIs.
- Experience working with Parquet files for data storage and processing.
- Experience with data integration from Azure Data Factory, Azure Data Lake, and other relevant Azure services.
- Perform data quality checks and validation to ensure data accuracy and integrity.
- Troubleshoot and resolve data pipeline issues effectively.
- Collaborate with data analysts, business analysts, and business stakeholders to understand their data needs and translate them into technical solutions.
- Participate in code reviews and contribute to best practices within the team.
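
The Delta MERGE sketch referenced above: a hedged, minimal upsert into a Delta table, matching the Delta Lake and Unity Catalog work this posting describes. The table names, landing path, and join key are illustrative assumptions.

```python
# Hypothetical sketch: upsert incoming records into a Delta table via MERGE.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # provided by the Databricks runtime

updates = spark.read.parquet(
    "abfss://landing@example.dfs.core.windows.net/customers/"  # assumed path
)

# Unity Catalog three-part name (catalog.schema.table) -- assumed.
target = DeltaTable.forName(spark, "main.crm.customers")

(target.alias("t")
 .merge(updates.alias("s"), "t.customer_id = s.customer_id")
 .whenMatchedUpdateAll()       # update rows that already exist
 .whenNotMatchedInsertAll()    # insert rows that are new
 .execute())
```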

Posted 2 months ago

Apply

4.0 - 9.0 years

6 - 10 Lacs

Hyderabad

Work from Office

- Building and operationalizing large-scale enterprise data solutions and applications using one or more Azure data and analytics services in combination with custom solutions: Azure Synapse/Azure SQL DW, Azure Data Lake, Azure Blob Storage, Spark, HDInsight, Databricks, Cosmos DB, Event Hub/IoT Hub.
- Experience migrating on-premise data warehouses to data platforms on the Azure cloud.
- Designing and implementing data engineering, ingestion, and transformation functions using Azure Synapse or Azure SQL Data Warehouse; Spark on Azure is available in HDInsight and Databricks.
- Good customer communication.
- Good analytical skills.

Posted 2 months ago

Apply

9.0 - 13.0 years

8 - 13 Lacs

Hyderabad

Work from Office

Experience: 8+ years | Location: Knowledge City, Hyderabad | Work Model: Hybrid | Regular work hours | No. of rounds: 1 internal technical round and 2 client rounds

About You: The GCP CloudOps Engineer is accountable for continuous, repeatable, secure, and automated deployment, integration, and test solutions utilizing Infrastructure as Code (IaC) and DevSecOps techniques.

- 8+ years of hands-on experience in infrastructure design, implementation, and delivery.
- 3+ years of hands-on experience with monitoring tools (Datadog, New Relic, or Splunk).
- 4+ years of hands-on experience with container orchestration services, including Docker or Kubernetes/GKE.
- Experience working across time zones and with different cultures.
- 5+ years of hands-on experience with cloud technologies; GCP is preferred.
- Maintain an outstanding level of documentation, including principles, standards, practices, and project plans.
- Experience building a data warehouse using Databricks is a huge plus.
- Hands-on experience with IaC patterns and practices and related automation tools such as Terraform, Jenkins, Spinnaker, and CircleCI; has built automation and tools using Python, Go, Java, or Ruby.
- Deep knowledge of CI/CD processes, tools, and platforms like GitHub workflows and Azure DevOps.
- Proactive collaborator who can work on cross-team initiatives, with excellent written and verbal communication skills.
- Experience automating long-term solutions to problems rather than applying a quick fix.
- Extensive knowledge of improving platform observability and implementing optimizations to monitoring and alerting tools.
- Experience measuring and modeling cost and performance metrics of cloud services and establishing a vision backed by data.
- Develop tools and CI/CD frameworks to make it easier for teams to build, configure, and deploy applications.
- Contribute to cloud strategy discussions and decisions on overall cloud design and the best approach for implementing cloud solutions.
- Follow and develop standards and procedures for all aspects of a digital platform in the cloud.
- Identify system enhancements and automation opportunities for installing and maintaining digital platforms.
- Adhere to best practices for incident, problem, and change management.
- Implement automated procedures to handle issues and alerts proactively.
- Experience debugging applications and a deep understanding of deployment architectures.

Pluses:
- Databricks
- Experience with multi-cloud environments (GCP, AWS, Azure); GCP is the preferred cloud provider
- Experience with GitHub and GitHub Actions

Posted 2 months ago

Apply

6.0 - 10.0 years

6 - 10 Lacs

Bengaluru

Work from Office

Job Role: Big Data Engineer | Work Location: Bangalore (CV Raman Nagar) | Experience: 7+ Years | Notice Period: Immediate to 30 days
Mandatory Skills: Big Data, Python, SQL, Spark/PySpark, AWS Cloud

JD and required skills & responsibilities:
- Actively participate in all phases of the software development lifecycle, including requirements gathering, functional and technical design, development, testing, roll-out, and support.
- Solve complex business problems by utilizing a disciplined development methodology.
- Produce scalable, flexible, efficient, and supportable solutions using appropriate technologies.
- Analyse the source and target system data; map the transformations that meet the requirements.
- Interact with the client and onsite coordinators during different phases of a project.
- Design and implement product features in collaboration with business and technology stakeholders.
- Anticipate, identify, and solve issues concerning data management to improve data quality.
- Clean, prepare, and optimize data at scale for ingestion and consumption.
- Support the implementation of new data management projects and the restructuring of the current data architecture.
- Implement automated workflows and routines using workflow scheduling tools.
- Understand and use continuous integration, test-driven development, and production deployment frameworks.
- Participate in design, code, and test plan reviews and dataset implementations performed by other data engineers in support of maintaining data engineering standards.
- Analyze and profile data for the purpose of designing scalable solutions.
- Troubleshoot straightforward data issues and perform root cause analysis to proactively resolve product issues.

Required Skills:
- 5+ years of relevant experience developing data and analytics solutions.
- Experience building data lake solutions leveraging one or more of AWS EMR, S3, Hive, and PySpark.
- Experience with relational SQL.
- Experience with scripting languages such as Python.
- Experience with source control tools such as GitHub and the related development process.
- Experience with workflow scheduling tools such as Airflow.
- In-depth knowledge of AWS Cloud (S3, EMR, Databricks).
- A passion for data solutions, with a strong problem-solving and analytical mindset.
- Working experience in the design, development, and testing of data pipelines.
- Experience working with Agile teams.
- Able to influence and communicate effectively, both verbally and in writing, with team members and business stakeholders.
- Able to quickly pick up new programming languages, technologies, and frameworks.
- Bachelor's degree in computer science.

Posted 2 months ago

Apply

4.0 - 8.0 years

3 - 7 Lacs

Bengaluru

Work from Office

About The Role:
- Minimum 4 years of experience in a relevant field.
- Hands-on experience in Databricks, SQL, Azure Data Factory, and Azure DevOps.
- Strong expertise in Microsoft Azure cloud platform services (Azure Data Factory, Azure Databricks, Azure SQL Database, Azure Data Lake Storage, Azure Synapse Analytics).
- Proficient in CI/CD pipelines in Azure DevOps for automated deployments.
- Good at performance optimization techniques such as temp tables, CTEs, indexing, merge statements, and joins.
- Familiarity with advanced SQL and programming skills (e.g., Python, PySpark).
- Familiarity with data warehousing and data modelling concepts.
- Good at data management and deployment processes using Azure Data Factory, Databricks, and Azure DevOps.
- Knowledge of integrating Azure services with DevOps.
- Experience in designing and implementing scalable data architectures.
- Proficient in ETL processes and tools.
- Strong communication and collaboration skills.
- Certifications in relevant Azure technologies are a plus.

Location: Bangalore/Hyderabad

Posted 2 months ago

Apply

3.0 - 7.0 years

12 - 16 Lacs

Bengaluru

Work from Office

Who We Are
Applied Materials is the global leader in materials engineering solutions used to produce virtually every new chip and advanced display in the world. We design, build and service cutting-edge equipment that helps our customers manufacture display and semiconductor chips, the brains of devices we use every day. As the foundation of the global electronics industry, Applied enables the exciting technologies that literally connect our world, like AI and IoT. If you want to work beyond the cutting edge, continuously pushing the boundaries of science and engineering to make possible the next generations of technology, join us to Make Possible® a Better Future.

What We Offer
Location: Bangalore, IND
At Applied, we prioritize the well-being of you and your family and encourage you to bring your best self to work. Your happiness, health, and resiliency are at the core of our benefits and wellness programs. Our robust total rewards package makes it easier to take care of your whole self and your whole family. We're committed to providing programs and support that encourage personal and professional growth and care for you at work, at home, or wherever you may go. Learn more about our benefits. You'll also benefit from a supportive work culture that encourages you to learn, develop and grow your career as you take on challenges and drive innovative solutions for our customers. We empower our team to push the boundaries of what is possible while learning every day in a supportive leading global company. Visit our Careers website to learn more about careers at Applied.

Key Responsibilities
- Provide technical support for applications built using .Net as well as Angular, React, and other open-source technologies.
- Troubleshoot and resolve issues related to front ends, APIs, and backend services.
- Collaborate with development teams to understand and resolve technical issues.
- Assist in the deployment and maintenance of software applications.
- Ensure the performance, quality, and responsiveness of applications, and apply permanent fixes to critical and recurring issues.
- Help maintain code quality, organization, and automation.
- Perform design reviews with the respective development teams for critical applications and provide inputs.
- Document support processes and solutions for future reference.
- Stay up to date with the latest industry trends and technologies.

Required Skills and Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 8+ years of experience in software development and support.
- Strong proficiency in .Net, Angular, and React; proficient in Python for backend support.
- Familiarity with the Hadoop ecosystem as well as Databricks.
- Experience with RESTful APIs and web services.
- Solid understanding of front-end technologies, including HTML5, CSS3, and JavaScript, as well as Azure and AWS.
- Strong background in SQL Server and other relational databases.
- Familiarity with version control systems (e.g., Git) as well as Atlassian products for software development, and with code deployment mechanisms/DevOps.
- Best practices in hosting applications on containerized platforms like OCP (on-prem and cloud).
- Experience with open-source projects and contributions.
- Strong problem-solving skills and attention to detail.
- Excellent communication and teamwork skills.
- Certifications in relevant areas, especially Microsoft, will be a plus.

Functional Knowledge: Demonstrates conceptual and practical expertise in own discipline; knowledge of the semiconductor industry is nice to have.

Interpersonal Skills: Explains difficult or sensitive information; works to build consensus.

Additional Information
Time Type: Full time
Employee Type: Assignee / Regular
Travel: Yes, 10% of the Time
Relocation Eligible: Yes

Applied Materials is an Equal Opportunity Employer. Qualified applicants will receive consideration for employment without regard to race, color, national origin, citizenship, ancestry, religion, creed, sex, sexual orientation, gender identity, age, disability, veteran or military status, or any other basis prohibited by law.

Posted 2 months ago

Apply

3.0 - 6.0 years

0 Lacs

India

On-site

About the Role:
Job Description: We are seeking a skilled and motivated Application Operations Engineer for an SRE role with a Java, React JS, and Spring Boot skill set, along with expertise in Databricks, particularly with Oracle integration, to join our dynamic SRE team. The ideal candidate should have 3 to 6 years of experience supporting robust web applications using Java, React JS, and Spring Boot, with a strong background in managing and optimizing data workflows leveraging Oracle databases. The incumbent will be responsible for supporting applications, troubleshooting issues, and providing RCAs and suggested fixes by managing continuous integration and deployment pipelines, automating processes, and ensuring system reliability, maintainability, and stability.

Responsibilities:
- Work in CI/CD, handle infrastructure issues, support operations, and maintain user-facing features using React JS, Spring Boot, and Java.
- Support reusable components and front-end libraries for future use.
- Partner with development teams to improve services through rigorous testing and release procedures.
- Willingness to learn new tools and technologies as the project demands.
- Ensure the technical feasibility of UI/UX designs.
- Optimize applications for maximum speed and scalability.
- Collaborate with other team members and stakeholders.
- Work closely with data engineers to ensure smooth data flow and integration.
- Create and maintain documentation for data processes and workflows.
- Troubleshoot and resolve issues related to data integrity and performance.
- Good to have: working knowledge of the Tomcat app server and Apache web server, Oracle, and Postgres.
- Command of Linux and Unix.
- Self-driven individual.

Requirements:
- Bachelor's degree in computer science engineering, or a related field.
- 3-6 years of professional experience.
- Proficiency in advanced Java and JavaScript, including DOM manipulation and the JavaScript object model.
- Experience with popular React JS workflows (such as Redux, MobX, Flux).
- Familiarity with RESTful APIs.
- Experience with cloud platforms such as AWS and Azure.
- Knowledge of CI/CD pipelines and DevOps practices.
- Experience with data engineering tools and technologies, particularly Databricks.
- Proficiency in Oracle database technologies and SQL queries.
- Excellent problem-solving skills and attention to detail.
- Ability to work independently and as part of a team.
- Good verbal and written communication skills.
- Familiarity with ITSM processes such as Incident, Problem, and Change Management using ServiceNow (preferable).
- Ability to work in shifts.

Grade: 09
Location: Hyderabad
Hybrid Mode: twice a week work from office
Shift Time: 6:30 am to 1 pm OR 2 pm to 10 pm IST

S&P Global Ratings is a division of S&P Global (NYSE: SPGI). S&P Global is the world's foremost provider of credit ratings, benchmarks, analytics and workflow solutions in the global capital, commodity and automotive markets. With every one of our offerings, we help many of the world's leading organizations navigate the economic landscape so they can plan for tomorrow, today. For more information, visit

What's In It For You
Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology: the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day.
We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence, pinpointing risks and opening possibilities. We Accelerate Progress. Our People: We're more than 35,000 strong worldwide-so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We're committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We're constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference. Our Values: Integrity, Discovery, Partnership At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals. Benefits: We take care of you, so you can take care of business. We care about our people. That's why we provide everything you-and your career-need to thrive at S&P Global. Our benefits include: Health & Wellness: Health care coverage designed for the mind and body. Flexible Downtime: Generous time off helps keep you energized for your time on. Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills. Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly Perks: It's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in class benefits for families. Beyond the Basics: From retail discounts to referral incentive awards-small perks can make a big difference. For more information on benefits by country visit: Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets. S&P Global has a Securities Disclosure and Trading Policy (the Policy) that seeks to mitigate conflicts of interest by monitoring and placing restrictions on personal securities holding and trading. The Policy is designed to promote compliance with global regulations. In some Divisions, pursuant to the Policy's requirements, candidates at S&P Global may be asked to disclose securities holdings. Some roles may include a trading prohibition and remediation of positions when there is an effective or potential conflict of interest. 
Employment at S&P Global is contingent upon compliance with the Policy. ----------------------------------------------------------- Equal Opportunity Employer S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to: and your request will be forwarded to the appropriate person. US Candidates Only: The EEO is the Law Poster describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - ----------------------------------------------------------- 20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group)

Posted 2 months ago

Apply

2.0 - 5.0 years

4 - 7 Lacs

Gurugram

Work from Office

Description: Data Analyst II

Syneos Health is a leading fully integrated biopharmaceutical solutions organization built to accelerate customer success. We translate unique clinical, medical affairs and commercial insights into outcomes to address modern market realities. Every day we perform better because of how we work together, as one team, each the best at what we do. We bring a wide range of talented experts together across a wide range of business-critical services that support our business. Every role within Corporate is vital to furthering our vision of Shortening the Distance from Lab to Life.

JOB SUMMARY: The Data Analyst II supports our business goals by analyzing datasets and providing recommendations to maximize efficiency and effectiveness for our project teams and customers. This role also plays a key part in tracking product approvals, measuring customer satisfaction, supporting business development efforts, and maintaining key performance data to drive strategic decision-making.

Job Responsibilities:
- Work independently to solve open-ended questions.
- Design and analyze tests and experiments (a minimal A/B-test sketch follows this posting).
- Maintain documentation of analytical processes and projects.
- Build, maintain, and improve performance dashboards, leveraging customer feedback for use and accessibility.
- Advise clients on relevant best practices and ensure the data is easily retrievable for their review.
- Support data quality and understand customer needs as they evolve.
- Mentor and coach junior team members.
- Support site advocacy group meetings by inviting PIs, discussing blinded protocols, collecting feedback, and managing scheduling, hosting, and meeting minutes.
- Develop and manage capabilities decks twice annually, along with bespoke slides and marketing information sheets using Power BI data.
- Track and analyze business development outcomes through opportunity trackers, monitoring RFP success rates, regulatory approvals, and win rates.
- Monitor customer satisfaction by reviewing feedback from the EM team and facilitating monthly cross-time-zone communications.
- Oversee product approval tracking, ensuring visibility into product lifecycle status and final approval outcomes.

QUALIFICATION REQUIREMENTS:
- Bachelor's degree in a related field such as Computer Science or Statistics.
- Strong data manipulation skills: querying and manipulating data with SQL; advanced MS Excel skills (VLOOKUP, functions, dashboards, Power Pivot); knowledge of Python or R.
- Experience with A/B conversion testing.
- Cross-functional collaboration experience with IT and data engineering teams to ensure the infrastructure supports scalable, efficient data analysis.
- Concise and clear written and oral communication.
- Proven experience delivering insights and reports with PowerPoint slides to customers.
- Strong attention to detail.
- Experience building dashboards in Power BI or Tableau.

Preferred:
- Knowledge of clinical decision support systems and healthcare operational workflows.
- Familiarity with cloud platforms (Azure, AWS) for data storage and analysis.
- Experience working with Databricks, Apache Spark, and ETL pipelines for large-scale data processing and analytics.

Get to know Syneos Health: Over the past 5 years, we have worked with 94% of all novel FDA-approved drugs, 95% of EMA-authorized products, and over 200 studies across 73,000 sites and 675,000+ trial patients. No matter what your role is, you'll take the initiative and challenge the status quo with us in a highly competitive and ever-changing environment. Learn more about Syneos Health.
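
The A/B-test sketch referenced above: a hedged, minimal two-proportion z-test of the kind used in A/B conversion testing. The conversion counts are made-up illustration data.

```python
# Hypothetical sketch: did variants A and B convert at different rates?
from statsmodels.stats.proportion import proportions_ztest

conversions = [412, 380]   # variant A, variant B (illustrative numbers)
visitors = [5000, 5000]

# Two-sided two-proportion z-test on conversion counts vs. sample sizes.
stat, p_value = proportions_ztest(count=conversions, nobs=visitors)

rate_a, rate_b = (c / n for c, n in zip(conversions, visitors))
print(f"A: {rate_a:.2%}  B: {rate_b:.2%}  z={stat:.2f}  p={p_value:.3f}")
# A p-value below the chosen threshold (commonly 0.05) would suggest a real
# difference in conversion rate rather than random variation.
```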

Posted 2 months ago

Apply

5.0 - 9.0 years

7 - 11 Lacs

Bengaluru

Work from Office

Dreaming big is in our DNA. It's who we are as a company. It's our culture. It's our heritage. And more than ever, it's our future. A future where we're always looking forward. Always serving up new ways to meet life's moments. A future where we keep dreaming bigger. We look for people with passion, talent, and curiosity, and provide them with the teammates, resources, and opportunities to unleash their full potential. The power we create together when we combine your strengths with ours is unstoppable. Are you ready to join a team that dreams as big as you do?

AB InBev GCC was incorporated in 2014 as a strategic partner for Anheuser-Busch InBev. The center leverages the power of data and analytics to drive growth for critical business functions such as operations, finance, people, and technology. The teams are transforming Operations through Tech and Analytics. Do You Dream Big? We Need You.

Job Description
Job Title: Senior ML Engineer
Location: Bangalore
Reporting to: Director Data Analytics

Purpose of the role
Anheuser-Busch InBev (AB InBev)'s Supply Analytics is responsible for building competitive, differentiated solutions that enhance brewery efficiency through data-driven insights. We optimize processes, reduce waste, and improve productivity by leveraging advanced analytics and AI-driven solutions. The Senior MLE will be responsible for the end-to-end deployment of machine learning models on edge devices. You will take ownership of all aspects of edge deployment, including model optimization, scaling complexities, containerization, and infrastructure management, ensuring high availability and performance.

Key tasks & accountabilities
Lead the entire edge deployment lifecycle, from model training to deployment and monitoring on edge devices.
Develop and maintain a scalable Edge ML pipeline that enables real-time analytics at brewery sites.
Optimize and containerize models using Portainer, Docker, and Azure Container Registry (ACR) to ensure efficient execution in constrained edge environments.
Own and manage the GitHub repository, ensuring structured, well-documented, and modularized code for seamless deployments.
Establish robust CI/CD pipelines for continuous integration and deployment of models and services.
Implement logging, monitoring, and alerting for deployed models to ensure reliability and quick failure recovery.
Ensure compliance with security and governance best practices for data and model deployment in edge environments.
Document the thought process and create artifacts on the team repo/wiki that can be shared with business and engineering for sign-off.
Review code quality and design developed by peers.
Significantly improve the performance and reliability of our code to produce high-quality, reproducible results.
Develop internal tools/utils that improve the productivity of the entire team.
Collaborate with other team members to advance the team's ability to ship high-quality code fast.
Mentor/coach junior team members to continuously upskill them.
Maintain basic developer hygiene, including but not limited to writing tests, using loggers, and maintaining readmes.

Qualifications, Experience, Skills
Level of educational attainment required (1 or more of the following): Bachelor's or Master's in Computer Application, Computer Science, or any engineering discipline.

Previous Work Experience
5+ years of real-world experience developing scalable, high-quality ML models.
Strong problem-solving skills with an owner's mindset, proactively identifying and resolving bottlenecks.

Technical Skills Required
Proficiency with pandas, NumPy, SciPy, scikit-learn, statsmodels, and TensorFlow.
Good understanding of statistical computing and parallel processing.
Experience with advanced distributed TensorFlow, NumPy, and joblib.
Good understanding of memory management and parallel processing in Python.
Profiling and optimization of production code.
Strong Python coding skills; exposure to working in IDEs such as VS Code or PyCharm.
Experience in code versioning using Git and maintaining a modularized code base for multiple deployments.
Experience working in an Agile environment.
In-depth understanding of Databricks (Workflows, cluster creation, repo management).
In-depth understanding of machine learning solutions in the Azure cloud.
Best practices in coding standards, unit testing, and automation.
Proficiency in Docker, Kubernetes, Portainer, and container orchestration for edge computing.

Other Skills Required
Experience in real-time analytics and edge AI deployments.
Exposure to DevOps practices, including infrastructure automation and monitoring tools.
Contributions to OSS or Stack Overflow.
And above all of this, an undying love for beer! We dream big to create a future with more cheers.
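The model-optimization step named in the accountabilities above can be pictured with a small example. Below is a minimal sketch of one common approach for constrained edge hardware, post-training quantization via TensorFlow Lite; the model path is hypothetical, and the posting does not specify this exact technique.

```python
# A minimal sketch: convert a TensorFlow SavedModel to a quantized
# TensorFlow Lite model for edge deployment. Paths are hypothetical.
import tensorflow as tf

converter = tf.lite.TFLiteConverter.from_saved_model("models/defect_detector")
converter.optimizations = [tf.lite.Optimize.DEFAULT]  # post-training quantization
tflite_model = converter.convert()  # returns the serialized model as bytes

with open("models/defect_detector.tflite", "wb") as f:
    f.write(tflite_model)
```

The resulting .tflite artifact is small enough to bake into a container image and ship to edge devices through a registry such as ACR, which is how the containerization and deployment responsibilities above would typically connect.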

Posted 2 months ago

Apply

10.0 - 15.0 years

30 - 45 Lacs

Pune

Work from Office

Azure Cloud Data Solutions Architect

Job Title: Azure Cloud Data Solutions Architect
Location: Pune, India
Experience: 10 - 15 Years
Work Mode: Full-time, Office-based
Company: Smartavya Analytica Private Limited

Company Overview:
Smartavya Analytica is a niche Data and AI company based in Mumbai, established in 2017. We specialize in data-driven innovation, transforming enterprise data into strategic insights. With expertise spanning 25+ Data Modernization projects and handling large datasets of up to 24 PB in a single implementation, we have successfully delivered data and AI projects across multiple industries, including retail, finance, telecom, manufacturing, insurance, and capital markets. We are specialists in Cloud, Hadoop, Big Data, AI, and Analytics, with a strong focus on Data Modernization for On-premises, Private, and Public Cloud Platforms. Visit us at: https://smart-analytica.com

Job Summary:
We are seeking an experienced Azure Cloud Data Solutions Architect to lead end-to-end architecture and delivery of enterprise-scale cloud data platforms. The ideal candidate will have deep expertise in Azure Data Services, Data Engineering, and Data Governance, with the ability to architect and guide cloud modernization initiatives.

Key Responsibilities:
Architect and design data lakehouses, data warehouses, and analytics platforms using Azure Data Services.
Lead implementations using Azure Data Factory (ADF), Azure Synapse Analytics, and Microsoft Fabric (OneLake ecosystem).
Define and implement data governance frameworks, including cataloguing, lineage, security, and quality controls.
Collaborate with business stakeholders, data engineers, and developers to translate business requirements into scalable Azure architectures.
Ensure platform design meets performance, scalability, security, and regulatory compliance needs.
Guide migration of on-premises data platforms to Azure Cloud environments.
Create architectural artifacts: solution blueprints, reference architectures, governance models, and best-practice guidelines.
Collaborate with Sales/presales in customer meetings to understand the business requirement and scope of work, and propose relevant solutions.
Drive MVPs/PoCs and capability demos for prospective customers and opportunities.

Must-Have Skills:
10-15 years of experience in data architecture, data engineering, or analytics solutions.
Hands-on expertise in Azure Cloud services: ADF, Synapse, Microsoft Fabric (OneLake), and Databricks (good to have).
Strong understanding of data governance, metadata management, and compliance frameworks (e.g., GDPR, HIPAA).
Deep knowledge of relational and non-relational databases (SQL, NoSQL) on Azure.
Experience with security practices (IAM, RBAC, encryption, data masking) in cloud environments.
Strong client-facing skills with the ability to present complex solutions clearly.

Preferred Certifications:
Microsoft Certified: Azure Solutions Architect Expert
Microsoft Certified: Azure Data Engineer Associate
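For a flavor of the hands-on ADF work this architecture oversees, here is a minimal sketch of triggering a pipeline run programmatically, assuming the azure-identity and azure-mgmt-datafactory Python SDKs. All resource, factory, and pipeline names are hypothetical placeholders.

```python
# A minimal sketch: start an Azure Data Factory pipeline run from Python.
# Requires: pip install azure-identity azure-mgmt-datafactory
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

credential = DefaultAzureCredential()  # picks up env vars, managed identity, or az login
adf_client = DataFactoryManagementClient(credential, "<subscription-id>")

run = adf_client.pipelines.create_run(
    resource_group_name="rg-dataplatform",    # hypothetical resource group
    factory_name="adf-lakehouse",             # hypothetical data factory
    pipeline_name="pl_ingest_to_onelake",     # hypothetical pipeline
    parameters={"load_date": "2024-01-01"},
)
print(f"Started pipeline run: {run.run_id}")
```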

Posted 2 months ago

Apply

1.0 - 3.0 years

11 - 15 Lacs

Mumbai

Work from Office

Overview
The Data Technology team at MSCI is responsible for meeting the data requirements across various business areas, including Index, Analytics, and Sustainability. Our team collates data from multiple sources such as vendors (e.g., Bloomberg, Reuters), website acquisitions, and web scraping (e.g., financial news sites, company websites, exchange websites, filings). This data can be in structured or semi-structured formats. We normalize the data, perform quality checks, assign internal identifiers, and release it to downstream applications.

Responsibilities
As data engineers, we build scalable systems to process data in various formats and volumes, ranging from megabytes to terabytes. Our systems perform quality checks, match data across various sources, and release it in multiple formats. We leverage the latest technologies, sources, and tools to process the data. Some of the exciting technologies we work with include Snowflake, Databricks, and Apache Spark.

Qualifications
Core Java, Spring Boot, Apache Spark, Spring Batch, Python.
Exposure to SQL databases like Oracle, MySQL, and Microsoft SQL Server is a must.
Any experience, knowledge, or certification in cloud technology, preferably Microsoft Azure or Google Cloud Platform, is good to have.
Exposure to non-SQL databases like Neo4j or document databases is also good to have.

What we offer you
Transparent compensation schemes and comprehensive employee benefits, tailored to your location, ensuring your financial security, health, and overall wellbeing.
Flexible working arrangements, advanced technology, and collaborative workspaces.
A culture of high performance and innovation where we experiment with new ideas and take responsibility for achieving results.
A global network of talented colleagues who inspire, support, and share their expertise to innovate and deliver for our clients.
A Global Orientation program to kickstart your journey, followed by access to our Learning@MSCI platform, LinkedIn Learning Pro, and tailored learning opportunities for ongoing skills development.
Multi-directional career paths that offer professional growth and development through new challenges, internal mobility, and expanded roles.
We actively nurture an environment that builds a sense of inclusion, belonging, and connection, including eight Employee Resource Groups: All Abilities, Asian Support Network, Black Leadership Network, Climate Action Network, Hola! MSCI, Pride & Allies, Women in Tech, and Women's Leadership Forum.

At MSCI we are passionate about what we do, and we are inspired by our purpose: to power better investment decisions. You'll be part of an industry-leading network of creative, curious, and entrepreneurial pioneers. This is a space where you can challenge yourself, set new standards, and perform beyond expectations for yourself, our clients, and our industry. MSCI is a leading provider of critical decision support tools and services for the global investment community. With over 50 years of expertise in research, data, and technology, we power better investment decisions by enabling clients to understand and analyze key drivers of risk and return and confidently build more effective portfolios. We create industry-leading research-enhanced solutions that clients use to gain insight into and improve transparency across the investment process. MSCI Inc. is an equal opportunity employer.
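The quality-check and identifier-assignment workflow described above translates naturally into Spark. Below is a minimal sketch of that kind of batch validation in PySpark; the dataset path, column names, and thresholds are all hypothetical, not MSCI's actual pipeline.

```python
# A minimal sketch: validate a vendor feed before releasing it downstream.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("vendor-feed-qc").getOrCreate()

# Hypothetical normalized feed with an internal identifier already assigned.
df = spark.read.parquet("s3://feeds/vendor_a/latest/")

total = df.count()
null_ids = df.filter(F.col("internal_id").isNull()).count()
dupes = total - df.dropDuplicates(["internal_id"]).count()

# Fail the batch if identifier coverage or uniqueness is off.
assert null_ids == 0, f"{null_ids} rows missing internal identifier"
assert dupes / total < 0.001, f"duplicate rate too high: {dupes}/{total}"
```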
It is the policy of the firm to ensure equal employment opportunity without discrimination or harassment on the basis of race, color, religion, creed, age, sex, gender, gender identity, sexual orientation, national origin, citizenship, disability, marital and civil partnership/union status, pregnancy (including unlawful discrimination on the basis of a legally protected parental leave), veteran status, or any other characteristic protected by law. MSCI is also committed to working with and providing reasonable accommodations to individuals with disabilities. If you are an individual with a disability and would like to request a reasonable accommodation for any part of the application process, please email Disability.Assistance@msci.com and indicate the specifics of the assistance needed. Please note, this e-mail is intended only for individuals who are requesting a reasonable workplace accommodation; it is not intended for other inquiries.

To all recruitment agencies: MSCI does not accept unsolicited CVs/Resumes. Please do not forward CVs/Resumes to any MSCI employee, location, or website. MSCI is not responsible for any fees related to unsolicited CVs/Resumes.

Note on recruitment scams: We are aware of recruitment scams where fraudsters impersonating MSCI personnel may try to elicit personal information from job seekers. Read our full note on careers.msci.com.

Posted 2 months ago

Apply

6.0 - 11.0 years

19 - 27 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

We are looking for "Azure Data bricks Engineer" with Minimum 6 years experience Contact- Atchaya (95001 64554) Required Candidate profile Exp in Azure Data bricks and Python Must Have Data bricks Python Azure The Candidate must have 7-10 yrs of experience in data bricks, delta lake Hands-on exp on Azure Exp on Python scripting

Posted 2 months ago

Apply