4.0 - 8.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
As a skilled MLOps Support Engineer, you will be responsible for monitoring and managing ML model operational pipelines in AzureML and MLflow. Your primary focus will be on automation, integration validation, and CI/CD pipeline management to ensure stability and reliability in model deployment lifecycles.

Your objectives in this role include:
- Supporting and monitoring MLOps pipelines in AzureML and MLflow
- Managing CI/CD pipelines for model deployment and updates
- Handling model registry processes
- Performing testing and validation of integrated endpoints
- Automating monitoring and upkeep of ML pipelines
- Troubleshooting and resolving pipeline and integration-related issues

In your day-to-day responsibilities, you will support production ML pipelines using AzureML and MLflow, configure and manage model versioning and the registry lifecycle, automate alerts, monitoring tasks, and routine pipeline operations, validate REST API endpoints for ML models, implement CI/CD workflows for ML deployments, document and troubleshoot operational issues related to ML services, and collaborate with data scientists and platform teams to ensure delivery continuity.

To excel in this role, you should possess proficiency in AzureML, MLflow, and Databricks, a strong command of Python, experience with Azure CLI and scripting, a good understanding of CI/CD practices in MLOps, knowledge of model registry management and deployment validation, and at least 3-5 years of relevant experience in MLOps environments. While not mandatory, it would be beneficial to have exposure to monitoring tools like Azure Monitor and Prometheus, experience with REST API testing tools such as Postman, and familiarity with Docker/Kubernetes in ML deployments.
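As an illustration of the model-registry lifecycle work this posting describes, here is a minimal sketch using the MLflow client API; the model name, run ID, and stage values are hypothetical placeholders, not details from the posting.

```python
import mlflow
from mlflow.tracking import MlflowClient

# Assumes MLFLOW_TRACKING_URI points at the registry (e.g. an AzureML-backed server)
model_uri = "runs:/<run_id>/model"                          # placeholder run ID
mv = mlflow.register_model(model_uri, "churn-classifier")   # hypothetical model name

# Promote the new version to Staging once endpoint validation passes
MlflowClient().transition_model_version_stage(
    name="churn-classifier", version=mv.version, stage="Staging"
)
```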
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Haryana
On-site
As an AI/ML Specialist, you will be responsible for building intelligent systems utilizing OT sensor data and Azure ML tools. Your primary focus will be collaborating with data scientists, engineers, and operations teams to develop scalable AI solutions addressing critical manufacturing issues such as predictive maintenance, process optimization, and anomaly detection. This role involves bridging the edge and cloud environments by deploying AI solutions that run effectively on either cloud platforms or industrial edge devices.

Your key functions will include designing and developing ML models using time-series sensor data from OT systems, working closely with engineering and data science teams to translate manufacturing challenges into AI use cases, implementing MLOps pipelines on Azure ML, and integrating with Databricks/Delta Lake. Additionally, you will be responsible for deploying and monitoring models at the edge using Azure IoT Edge, conducting model validation, retraining, and performance monitoring, and collaborating with plant operations to contextualize insights and integrate them into workflows.

To qualify for this role, you should have a minimum of 5 years of experience in machine learning and AI. Hands-on experience with Azure ML, MLflow, Databricks, and PyTorch/TensorFlow is essential. You should also possess a proven ability to work with OT sensor data such as temperature, vibration, and flow. A strong background in time-series modeling, edge inferencing, and MLOps is required, along with familiarity with manufacturing KPIs and predictive modeling use cases.
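As a toy illustration of the time-series anomaly detection described above, here is a minimal rolling z-score sketch over a vibration signal; the column name, window, and threshold are illustrative assumptions, not requirements from the posting.

```python
import pandas as pd

def flag_anomalies(df: pd.DataFrame, col: str = "vibration",
                   window: int = 60, z_thresh: float = 3.0) -> pd.DataFrame:
    """Flag points whose rolling z-score exceeds the threshold."""
    rolling = df[col].rolling(window, min_periods=window)
    z = (df[col] - rolling.mean()) / rolling.std()
    return df.assign(zscore=z, anomaly=z.abs() > z_thresh)

# Usage: df is indexed by timestamp with a 'vibration' column sampled at 1 Hz
# anomalies = flag_anomalies(df)[lambda d: d.anomaly]
```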
Posted 1 week ago
7.0 - 14.0 years
0 Lacs
Hyderabad, Telangana
On-site
As a Business Analyst with 7-14 years of experience, you will be responsible for various tasks including creating Business Requirement Documents (BRDs) and Functional Requirement Documents (FRDs), stakeholder management, User Acceptance Testing (UAT), applying data warehouse concepts and SQL queries and subqueries, and utilizing data visualization tools such as Power BI or MicroStrategy. It is essential that you have a deep understanding of the investment domain, specifically in areas like capital markets, asset management, and wealth management.

Your primary responsibilities will involve working closely with stakeholders to gather requirements, analyzing data, and testing systems to ensure they meet business needs. Additionally, you should have a strong background in investment management or financial services, with experience in areas like asset management, investment operations, and insurance. Your familiarity with concepts like Critical Data Elements (CDEs), data traps, and reconciliation workflows will be beneficial in this role.

Technical expertise in BI and analytics tools like Power BI, Tableau, and MicroStrategy is required, along with proficiency in SQL. You should also possess excellent communication skills, analytical thinking capabilities, and the ability to engage effectively with stakeholders. Experience working within Agile/Scrum environments with cross-functional teams is highly valued.

In terms of technical skills, you should demonstrate proven abilities in analytical problem-solving, with deep knowledge of investment data platforms such as GoldenSource, NeoXam, RIMES, and JPM Fusion. Expertise in cloud data technologies like Snowflake, Databricks, and AWS/GCP/Azure data services is essential. Understanding data governance frameworks, metadata management, and data lineage is crucial, along with compliance standards in the investment management industry. Hands-on experience with Investment Books of Record (IBORs) like BlackRock Aladdin, CRD, Eagle STAR (ABOR), Eagle Pace, and Eagle DataMart is preferred. Familiarity with investment data platforms including GoldenSource, FINBOURNE, NeoXam, RIMES, and JPM Fusion, as well as cloud data platforms like Snowflake and Databricks, will be advantageous. Your background in data governance, metadata management, and data lineage frameworks will be essential in ensuring data accuracy and compliance within the organization.
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Hyderabad, Telangana
On-site
You are invited to join InfoBeans Technologies as a Data Engineer with a minimum of 5 years of experience in the field. This is a full-time position, and we are looking for individuals who are proficient in Snowflake, along with expertise in either Azure Data Factory (ADF) and Python or Power BI and data modeling.

As a Data Engineer at InfoBeans Technologies, you will be required to have hands-on experience with tools such as WhereScape RED + 3D, Data Vault 2.0, SQL, and data transformation pipelines. A strong understanding of data management and analytics principles is essential for this role. Additionally, excellent communication skills and the ability to engage in requirements engineering are highly valued. The successful candidate will be responsible for delivering and supporting production-ready data systems at an expert level of proficiency. The primary skill areas required for this role are data engineering and analytics.

If you are passionate about building robust data pipelines, modeling enterprise data, and visualizing meaningful insights, we would love to connect with you. Immediate availability or joining within 15 days is preferred for this position. To apply for this exciting opportunity, please send your resume to mradul.khandelwal@infobeans.com or reach out to us directly. Join us in shaping the future of data analytics and engineering at InfoBeans Technologies.
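For context on the Snowflake-plus-Python pairing this posting asks for, below is a minimal sketch using the official snowflake-connector-python package; the account, warehouse, and table names are hypothetical placeholders.

```python
import snowflake.connector

# Connection parameters are placeholders; in practice they come from a secrets store
conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="...",
    warehouse="ETL_WH", database="ANALYTICS", schema="STAGING",
)
try:
    cur = conn.cursor()
    # A simple transformation step: load deduplicated rows into a curated table
    cur.execute("""
        INSERT INTO CURATED.CUSTOMERS
        SELECT DISTINCT * FROM STAGING.CUSTOMERS_RAW
    """)
    print(f"Rows affected: {cur.rowcount}")
finally:
    conn.close()
```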
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
Karnataka
On-site
As an integral part of our team at Proximity, you will be taking on the role of both a hands-on tech lead and product manager. Your primary responsibility will be to deliver data/ML platforms and pipelines within a Databricks-Azure environment. In this capacity, you will be leading a small delivery team and collaborating with enabling teams to drive product, architecture, and data science initiatives. Your ability to translate business requirements into product strategy and technical delivery with a platform-first mindset will be crucial to our success.

To excel in this role, you should possess technical proficiency in Python, SQL, Databricks, Delta Lake, MLflow, Terraform, medallion architecture, data mesh/fabric, and Azure. Additionally, expertise in Agile delivery, discovery cycles, outcome-focused planning, and trunk-based development will be advantageous. You should also be adept at collaborating with engineers, working across cross-functional teams, and fostering self-service platforms. Clear communication skills will be key in articulating decisions, roadmap, and priorities effectively.

Joining our team comes with a host of benefits. You will have the opportunity to engage in Proximity Talks, where you can interact with fellow designers, engineers, and product enthusiasts, and gain insights from industry experts. Working alongside our world-class team will provide you with continuous learning opportunities, allowing you to challenge yourself and acquire new knowledge on a daily basis.

Proximity is a leading technology, design, and consulting partner for prominent Sports, Media, and Entertainment companies globally. With headquarters in San Francisco and additional offices in Palo Alto, Dubai, Mumbai, and Bangalore, we have a track record of creating high-impact, scalable products used by 370 million daily users. The collective net worth of our client companies stands at $45.7 billion since our inception in 2019.

At Proximity, we are a diverse team of coders, designers, product managers, and experts dedicated to solving complex problems and developing cutting-edge technology at scale. As our team of Proxonauts continues to expand rapidly, your contributions will play a significant role in the company's success. You will have the opportunity to collaborate with experienced leaders who have spearheaded multiple tech, product, and design teams. To learn more about us, you can watch our CEO, Hardik Jagda, share insights about Proximity, explore our values and meet our team members, visit our website, blog, and design wing at Studio Proximity, and gain behind-the-scenes access through our Instagram accounts @ProxWrks and @H.Jagda.
Posted 1 week ago
10.0 - 14.0 years
0 Lacs
Karnataka
On-site
A career at Conga is more than just a job - it's a complete package! At Conga, we have fostered a community where our colleagues can flourish. Here, you will have the chance to innovate, receive support for your growth through both individual and team development, and work in an environment where every voice is valued and heard.

Conga specializes in simplifying complexity within an ever-evolving world. Our revenue lifecycle management solution is designed to streamline order configuration, execution, fulfillment, and contract renewal processes by utilizing a single critical insights data model that adapts to changing business requirements and aligns the efforts of all teams. Our mission at Conga is to empower customers to achieve transformational revenue growth by harmonizing teams, processes, and technology to maximize customer lifetime value.

The Software Architect role at Conga is a pivotal position within the Platform and AI Services team. This team is dedicated to building and innovating foundational services, components, and frameworks essential for the SaaS Revenue Lifecycle Management platform powered by AI. As an AI Architect at Conga, you will play a key role in developing AI capabilities for the Conga Revenue Lifecycle Platform, catering to customer needs and expanding your own skill set. Your responsibilities will involve building and supporting high-scale production code in a multi-tenant SaaS environment.

One of the key aspects of this role is contributing to the architecture and development of core AI services for the Revenue Lifecycle Platform. As an AI Architect, you will be involved in designing end-to-end AI solutions, developing data pipelines, models, and deployment strategies, and integrating with existing systems. Moreover, you will collaborate with data scientists and software engineers to implement robust, scalable, and efficient AI solutions. Your role will also encompass managing the technical architecture of the AI platform, ensuring scalability, performance, security, and cost efficiency. Additionally, you will actively participate in the agile team lifecycle, including design, development, testing, planning, backlog grooming, and support.

To excel in this role, you should possess a Bachelor's degree in Computer Science or a related field, with 10+ years of expertise in areas such as machine learning, pattern recognition, natural language processing, information retrieval, large-scale distributed systems, and cloud computing. We are looking for a talented individual who can provide technical leadership, design reusable components and services, stay abreast of the latest advancements in AI, and bring strong communication and interpersonal skills. If you are self-driven, enjoy problem-solving, and are willing to work with a global team, then we encourage you to apply for this exciting opportunity at Conga.

If you believe you are the right candidate for this role or have a genuine interest in working in an inclusive and diverse workplace, we invite you to submit your application. Your resume can be in any format, but we recommend using PDF or plain text for easier review by our recruiters. At Conga, we value diversity and inclusivity, so even if you do not meet every qualification listed, we encourage you to apply as you may still be the ideal candidate for this or other roles.
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
Karnataka
On-site
You are an experienced BI Architect with a strong background in Power BI and the Microsoft Azure ecosystem. Your main responsibility will be to design, implement, and enhance business intelligence solutions that aid strategic decision-making within the organization. You will play a crucial role in leading the BI strategy, architecture, and governance processes, while also guiding a team of BI developers and data analysts.

Your key responsibilities will include designing and implementing scalable BI solutions using Power BI and Azure services, and defining BI architecture, data models, security models, and best practices for enterprise reporting. You will collaborate closely with business stakeholders to gather requirements and transform them into data-driven insights. Additionally, you will oversee data governance, metadata management, and Power BI workspace design, optimizing Power BI datasets, reports, and dashboards for performance and usability. Furthermore, you will be expected to establish standards for data visualization, the development lifecycle, version control, and deployment. As a mentor to BI developers, you will ensure adherence to coding and architectural standards, integrate Power BI with other applications using APIs, Power Automate, or embedded analytics, and monitor and troubleshoot production BI systems to maintain high availability and data accuracy.

To qualify for this role, you should have a minimum of 12 years of overall experience, with at least 7 years of hands-on experience with Power BI, including expertise in data modeling, DAX, M/Power Query, custom visuals, and performance tuning. Strong familiarity with Azure services such as Azure SQL Database, Azure Data Lake, Azure Functions, and Azure DevOps is essential. You must also possess a solid understanding of data warehousing, ETL, and dimensional modeling concepts, along with proficiency in SQL, data transformation, and data governance principles. Experience managing enterprise-level Power BI implementations with large user bases and complex security requirements, excellent communication and stakeholder management skills, the ability to lead cross-functional teams, and the ability to influence BI strategy across departments are also prerequisites for this role. Knowledge of Microsoft Fabric architecture and its components, a track record of managing BI teams of six or more, and the capability to provide technical leadership and team development are highly desirable. In addition, holding the Microsoft Fabric certifications DP-600 and PL-300 would be considered a bonus for this position.
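As one concrete example of the API-based Power BI integration mentioned above, here is a minimal sketch that queues a dataset refresh through the Power BI REST API; acquiring the Azure AD token and the workspace/dataset IDs shown are placeholder assumptions.

```python
import requests

# Placeholders: a real implementation would acquire the token via MSAL / Azure AD
token = "<azure_ad_access_token>"
group_id, dataset_id = "<workspace-guid>", "<dataset-guid>"

resp = requests.post(
    f"https://api.powerbi.com/v1.0/myorg/groups/{group_id}/datasets/{dataset_id}/refreshes",
    headers={"Authorization": f"Bearer {token}"},
    json={"notifyOption": "MailOnFailure"},
)
resp.raise_for_status()  # 202 Accepted means the refresh was queued
```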
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
Haryana
On-site
We are seeking an Analytics Developer with expertise in Databricks, Power BI, and ETL technologies to design, develop, and deploy advanced analytics solutions. Your focus will be on creating robust, scalable data pipelines, implementing actionable business intelligence frameworks, and delivering insightful dashboards and reports to drive strategic decision-making. This role involves close collaboration with technical teams and business stakeholders to ensure analytics initiatives align with organizational objectives. With 8+ years of experience in analytics, data integration, and reporting, you should possess 4+ years of hands-on experience with Databricks, including proficiency in Databricks Notebooks for development and testing.

Your key responsibilities will include:
- Leveraging Databricks to develop and optimize scalable data pipelines for real-time and batch data processing
- Designing and implementing Databricks Notebooks for exploratory data analysis, ETL workflows, and machine learning models
- Managing and optimizing Databricks clusters for performance, cost efficiency, and scalability
- Using Databricks SQL for advanced query development, data aggregation, and transformation
- Incorporating Python and/or Scala within Databricks workflows to automate and enhance data engineering processes
- Developing solutions to integrate Databricks with other platforms, such as Azure Data Factory, for seamless data orchestration
- Creating interactive and visually compelling Power BI dashboards and reports to enable self-service analytics
- Leveraging DAX to build calculated columns, measures, and complex aggregations
- Designing effective data models in Power BI using star schema and snowflake schema principles for optimal performance
- Configuring and managing Power BI workspaces, gateways, and permissions for secure data access
- Implementing row-level security and data masking strategies in Power BI to ensure compliance with governance policies
- Building real-time dashboards by integrating Power BI with Databricks, Azure Synapse, and other data sources
- Providing end-user training and support for Power BI adoption across the organization
- Developing and maintaining ETL/ELT workflows that ensure high data quality and reliability
- Implementing data governance frameworks to maintain data lineage, security, and compliance with organizational policies
- Optimizing data flow across multiple environments, including data lakes, warehouses, and real-time processing systems
- Collaborating with data governance teams to enforce standards for metadata management and audit trails
- Working closely with IT teams to integrate analytics solutions with ERP, CRM, and other enterprise systems
- Troubleshooting and resolving technical challenges related to data integration, analytics performance, and reporting accuracy
- Staying updated on the latest advancements in Databricks, Power BI, and data analytics technologies
- Driving innovation by integrating AI/ML capabilities into analytics solutions using Databricks
- Contributing to the enhancement of organizational analytics maturity through scalable and reusable approaches

You should possess self-management skills, the ability to think outside the box, a willingness to learn new technologies, logical thinking, fluency in English, and strong communication skills, along with a Bachelor's degree in Computer Science, Data Science, or a related field (Master's preferred), relevant certifications, and the ability to manage multiple priorities in a fast-paced environment with high customer expectations.
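To illustrate the kind of batch pipeline work this posting describes, here is a minimal PySpark sketch of a bronze-to-silver transformation on Delta tables; the table names and cleansing rules are illustrative assumptions, not the employer's actual pipeline.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()  # provided automatically in Databricks

# Bronze: raw ingested events; Silver: cleansed, deduplicated records (names assumed)
bronze = spark.read.table("bronze.sales_events")

silver = (
    bronze
    .dropDuplicates(["event_id"])                     # deduplicate on a business key
    .filter(F.col("amount").isNotNull())              # drop incomplete records
    .withColumn("event_date", F.to_date("event_ts"))  # derive a partition column
)

(silver.write.format("delta")
       .mode("overwrite")
       .partitionBy("event_date")
       .saveAsTable("silver.sales_events"))
```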
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Hyderabad, Telangana
On-site
As an AWS Data Engineer with a focus on Databricks, you will play a crucial role in designing, developing, and optimizing scalable data pipelines. Your expertise in Databricks, PySpark, and AWS development will be key in leading technical efforts and driving innovation across the stack.

Your responsibilities will include developing and optimizing data pipelines using Databricks (PySpark), implementing AWS AppSync and Lambda-based APIs for integration with Neptune and OpenSearch, collaborating with React developers and backend teams to enhance architecture, ensuring secure development practices (especially around IAM roles and AWS security), driving performance, scalability, and reliability improvements, and taking full ownership of assigned tasks and deliverables.

To excel in this role, you should have strong experience in Databricks and PySpark for building data pipelines, proficiency in AWS Neptune and OpenSearch, hands-on experience with AWS AppSync and Lambda functions, a solid grasp of IAM, CloudFront, and API development in AWS, familiarity with React.js front-end applications (a plus), strong problem-solving, debugging, and communication skills, and the ability to work independently and drive innovation.

Preferred qualifications include AWS certifications (Solutions Architect, Developer Associate, or Data Analytics Specialty) and production experience with graph databases and search platforms. This position offers a great opportunity to work with cutting-edge technologies, collaborate with talented teams, and make a significant impact on data engineering projects.
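For a sense of the Lambda-backed AppSync APIs this role mentions, here is a minimal Python handler sketch; the event shape follows AppSync's direct Lambda resolver convention, and the `getCustomer` field and its lookup are hypothetical.

```python
def handler(event, context):
    """Direct Lambda resolver for an AppSync GraphQL field (sketch only).

    AppSync passes the resolved field name and arguments in the event;
    real code would query Neptune or OpenSearch instead of returning a stub.
    """
    field = event["info"]["fieldName"]
    args = event.get("arguments", {})

    if field == "getCustomer":  # hypothetical field name
        return {"id": args["id"], "name": "example"}

    raise Exception(f"Unknown field: {field}")
```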
Posted 1 week ago
12.0 - 16.0 years
0 Lacs
Karnataka
On-site
As a Senior Data Modeller, you will be responsible for leading the design and development of conceptual, logical, and physical data models for enterprise and application-level databases. Your expertise in data modeling, data warehousing, and data governance, particularly in cloud environments, Databricks, and Unity Catalog, will be crucial for the role. You should have a deep understanding of business processes related to master data management in a B2B environment and experience with data governance and data quality concepts.

Your key responsibilities will include designing and developing data models, translating business requirements into structured data models, defining and maintaining data standards, collaborating with cross-functional teams to implement models, analyzing existing data systems for optimization, creating entity relationship diagrams and data flow diagrams, supporting data governance initiatives, and ensuring compliance with organizational data policies and security requirements.

To be successful in this role, you should have at least 12 years of experience in data modeling, data warehousing, and data governance. Strong familiarity with Databricks, Unity Catalog, and cloud environments (preferably Azure) is essential. Additionally, you should possess a background in data normalization, denormalization, dimensional modeling, and schema design, along with hands-on experience with data modeling tools like ERwin. Experience in Agile or Scrum environments, proficiency in integration, databases, data warehouses, and data processing, as well as a track record of successfully selling data and analytics software to enterprise customers are key requirements. Your technical expertise should cover big data, streaming platforms, Databricks, Snowflake, Redshift, Spark, Kafka, SQL Server, PostgreSQL, and modern BI tools. Your ability to design and scale data pipelines and architectures in complex environments, along with excellent soft skills including leadership, client communication, and stakeholder management, will be valuable assets in this role.
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Pune, Maharashtra
On-site
As a Senior Data Engineer at Ethoca, a Mastercard company, in Pune, India, you will play a crucial role in driving data enablement and exploring big data solutions within our technology landscape. Your responsibilities will include designing, developing, and optimizing batch and real-time data pipelines using tools such as Snowflake, Snowpark, Python, and PySpark. You will also be involved in building data transformation workflows, implementing CI/CD pipelines, and administering the Snowflake platform to ensure performance tuning, access management, and platform scalability.

Collaboration with stakeholders to understand data requirements and deliver reliable data solutions will be a key part of your role. Your expertise in cloud-based database infrastructure, SQL development, and building scalable data models using tools like Power BI will be essential in supporting business analytics and dashboarding. Additionally, you will be responsible for real-time data streaming pipelines and data observability practices, and for planning and executing deployments, migrations, and upgrades across data platforms while minimizing service impacts.

To be successful in this role, you should have a strong background in computer science or software engineering, along with deep hands-on experience with Snowflake, Snowpark, Python, PySpark, and CI/CD tooling. Familiarity with Schema Change, the Java JDK, the Spring and Spring Boot frameworks, Databricks, and real-time data processing is desirable. You should also possess excellent problem-solving and analytical skills, as well as effective written and verbal communication abilities for collaborating across technical and non-technical teams.

You will be part of a high-performing team that is committed to making systems resilient and easily maintainable on the cloud. If you are looking for a challenging role that allows you to leverage cutting-edge software and development skills while working with massive data volumes, this position at Ethoca may be the right fit for you.
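As a flavor of the Snowpark-based transformation work mentioned here, below is a minimal Python sketch that pushes an aggregation down into Snowflake; the connection parameters and table names are placeholder assumptions.

```python
from snowflake.snowpark import Session
from snowflake.snowpark import functions as F

# Connection config is a placeholder; real values come from a secrets manager
session = Session.builder.configs({
    "account": "my_account", "user": "etl_user", "password": "...",
    "warehouse": "ETL_WH", "database": "ANALYTICS", "schema": "PUBLIC",
}).create()

# Push-down transformation: aggregate daily totals server-side in Snowflake
orders = session.table("RAW.ORDERS")
daily = (orders.group_by(F.col("ORDER_DATE"))
               .agg(F.sum(F.col("AMOUNT")).alias("TOTAL_AMOUNT")))
daily.write.save_as_table("CURATED.DAILY_ORDER_TOTALS", mode="overwrite")
```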
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
You will be responsible for performing comprehensive testing of ETL pipelines to ensure data accuracy and completeness across different systems. This includes validating Data Warehouse objects such as fact and dimension tables, designing and executing test cases and test plans for data extraction, transformation, and loading processes, and conducting regression testing to validate enhancements without breaking existing data flows. You will also write complex SQL queries for data verification and backend testing, and test data processing workflows in Azure Data Factory and Databricks environments. Collaboration with developers, data engineers, and business analysts to understand requirements and proactively raise defects is a key part of this role. Additionally, you will be expected to perform root cause analysis for data-related issues and suggest improvements, as well as create clear and concise test documentation, logs, and reports.

The ideal candidate should possess strong knowledge of ETL testing methodologies and tools; excellent SQL skills, including joins, aggregation, subqueries, and performance tuning; hands-on experience with data warehousing and data models (star/snowflake); and experience in test case creation, execution, defect logging, and closure. Proficiency in regression testing, data validation, and data reconciliation is also required, as well as working knowledge of Azure Data Factory (ADF), Azure Synapse, and Databricks. Experience with test management tools like JIRA, TestRail, or HP ALM is essential.

Nice-to-have qualifications include exposure to automation testing for data pipelines, scripting knowledge in Python or PySpark, an understanding of CI/CD in data testing, and experience with data masking, data governance, and privacy rules.

To qualify for this role, you should have a Bachelor's degree in Computer Science, Information Systems, or a related field, along with at least 3 years of hands-on experience in ETL/Data Warehouse testing. Excellent analytical and problem-solving skills, strong attention to detail, and good communication skills are also necessary for this position.
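As an example of the data reconciliation checks this role performs, here is a minimal PySpark sketch comparing row counts and a simple sum checksum between a source and a target table; the table and column names are illustrative.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

def reconcile(source_tbl: str, target_tbl: str, amount_col: str = "amount"):
    """Compare row counts and a sum checksum between two tables."""
    src, tgt = spark.table(source_tbl), spark.table(target_tbl)
    src_stats = src.agg(F.count("*").alias("rows"), F.sum(amount_col).alias("total")).first()
    tgt_stats = tgt.agg(F.count("*").alias("rows"), F.sum(amount_col).alias("total")).first()
    assert src_stats["rows"] == tgt_stats["rows"], "Row count mismatch"
    assert src_stats["total"] == tgt_stats["total"], "Checksum mismatch"

# reconcile("staging.orders", "dw.fact_orders")  # hypothetical table names
```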
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
Haryana
On-site
You will be responsible for leading the design and implementation of an Azure-based digital and AI platform that facilitates scalable and secure product delivery across IT and OT domains. In collaboration with the Enterprise Architect, you will shape the platform architecture to ensure alignment with the overall digital ecosystem. Your role will involve integrating OT sensor data from PLCs, SCADA, and IoT devices into a centralized and governed Lakehouse environment, bridging plant-floor operations with cloud innovation.

Key Responsibilities:
- Architect and implement the Azure digital platform utilizing IoT Hub, IoT Edge, Synapse, Databricks, and Purview.
- Work closely with the Enterprise Architect to ensure that platform capabilities align with the broader enterprise architecture and digital roadmap.
- Design data ingestion flows and edge-to-cloud integration from OT systems such as SCADA, PLC, MQTT, and OPC-UA.
- Establish platform standards for data ingestion, transformation (Bronze, Silver, Gold), and downstream AI/BI consumption.
- Ensure security, governance, and compliance in accordance with standards like ISA-95 and the Purdue Model.
- Lead the technical validation of platform components and provide guidance on platform scaling across global sites.
- Implement microservices architecture patterns using containers (Docker) and orchestration (Kubernetes) to enhance platform modularity and scalability.

Requirements:
- A minimum of 8 years of experience in architecture or platform engineering roles.
- Demonstrated hands-on expertise with Azure services including Data Lake, Synapse, Databricks, IoT Edge, and IoT Hub.
- Deep understanding of industrial data protocols such as OPC-UA, MQTT, and Modbus.
- Proven track record of designing IT/OT integration solutions in manufacturing environments.
- Familiarity with Medallion architecture, time-series data, and Azure security best practices.
- TOGAF or Azure Solutions Architect certification is mandatory for this role.
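To make the edge-to-cloud ingestion path concrete, here is a minimal sketch using the Azure IoT Hub device SDK for Python (azure-iot-device); the connection string and payload fields are placeholder assumptions rather than details from the posting.

```python
import json
import time
from azure.iot.device import IoTHubDeviceClient, Message

# Placeholder: a real device would get its connection string from secure provisioning
CONN_STR = "HostName=<hub>.azure-devices.net;DeviceId=<device>;SharedAccessKey=<key>"

client = IoTHubDeviceClient.create_from_connection_string(CONN_STR)
client.connect()

# Send one hypothetical OT sensor reading as a device-to-cloud message
reading = {"sensor": "vibration-01", "value": 4.2, "ts": time.time()}
msg = Message(json.dumps(reading))
msg.content_type = "application/json"
client.send_message(msg)

client.shutdown()
```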
Posted 1 week ago
4.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
The Role
The Data Engineer is accountable for developing high-quality data products to support the Bank's regulatory requirements and data-driven decision making. A Data Engineer will serve as an example to other team members, work closely with customers, and remove or escalate roadblocks. By applying their knowledge of data architecture standards, data warehousing, data structures, and business intelligence, they will contribute to business outcomes on an agile team.

Responsibilities
- Developing and supporting scalable, extensible, and highly available data solutions
- Delivering on critical business priorities while ensuring alignment with the wider architectural vision
- Identifying and helping address potential risks in the data supply chain
- Following and contributing to technical standards
- Designing and developing analytical data models

Required Qualifications & Work Experience
- First Class Degree in Engineering/Technology (4-year graduate course)
- 5 to 8 years' experience implementing data-intensive solutions using agile methodologies
- Experience of relational databases and using SQL for data querying, transformation and manipulation
- Experience of modelling data for analytical consumers
- Ability to automate and streamline the build, test and deployment of data pipelines
- Experience in cloud native technologies and patterns
- A passion for learning new technologies, and a desire for personal growth, through self-study, formal classes, or on-the-job training
- Excellent communication and problem-solving skills

Technical Skills (Must Have)
- ETL: Hands-on experience of building data pipelines. Proficiency in two or more data integration platforms such as Ab Initio, Apache Spark, Talend and Informatica
- Big Data: Experience of 'big data' platforms such as Hadoop, Hive or Snowflake for data storage and processing
- Data Warehousing & Database Management: Understanding of data warehousing concepts, and of relational (Oracle, MSSQL, MySQL) and NoSQL (MongoDB, DynamoDB) database design
- Data Modeling & Design: Good exposure to data modeling techniques; design, optimization and maintenance of data models and data structures
- Languages: Proficient in one or more programming languages commonly used in data engineering, such as Python, Java or Scala
- DevOps: Exposure to concepts and enablers - CI/CD platforms, version control, automated quality control management

Technical Skills (Valuable)
- Ab Initio: Experience developing Co>Op graphs; ability to tune for performance. Demonstrable knowledge across the full suite of Ab Initio toolsets, e.g., GDE, Express>IT, Data Profiler, Conduct>IT, Control>Center, Continuous>Flows
- Cloud: Good exposure to public cloud data platforms such as S3, Snowflake, Redshift, Databricks, BigQuery, etc. Demonstrable understanding of underlying architectures and trade-offs
- Data Quality & Controls: Exposure to data validation, cleansing, enrichment and data controls
- Containerization: Fair understanding of containerization platforms like Docker and Kubernetes
- File Formats: Exposure to working with event/file/table formats such as Avro, Parquet, Protobuf, Iceberg and Delta
- Others: Basics of job schedulers like Autosys; basics of entitlement management

Certification on any of the above topics would be an advantage.
Job Family Group: Technology
Job Family: Digital Software Engineering
Time Type: Full time
Most Relevant Skills: Please see the requirements listed above.
Other Relevant Skills: For complementary skills, please see above and/or contact the recruiter.

Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View Citi's EEO Policy Statement and the Know Your Rights poster.
Posted 1 week ago
10.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Are you insatiably curious, deeply passionate about the realm of databases and analytics, and ready to tackle complex challenges in a dynamic environment in the era of AI? If so, we invite you to join our team as a Cloud & AI Solution Engineer in Innovative Data Platform for commercial customers at Microsoft. Here, you'll be at the forefront of innovation, working on cutting-edge projects that leverage the latest technologies to drive meaningful impact. Join us and be part of a team that thrives on collaboration, creativity, and continuous learning.

Databases & Analytics is a growth opportunity for Microsoft Azure, as well as its partners and customers. It includes a rich portfolio of products including IaaS and PaaS services on the Azure Platform in the age of AI. These technologies empower customers to build, deploy, and manage database and analytics applications in a cloud-native way.

As an Innovative Data Platform Solution Engineer (SE), you will play a pivotal role in helping enterprises unlock the full potential of Microsoft's cloud database and analytics stack across every stage of deployment. You'll collaborate closely with engineering leaders and platform teams to accelerate the Fabric Data Platform, including Azure Databases and Analytics, through hands-on engagements like Proof of Concepts, hackathons, and architecture workshops. This opportunity will allow you to accelerate your career growth, develop deep business acumen, hone your technical skills, and become adept at solution design and deployment. As a trusted technical advisor, you'll guide customers through secure, scalable solution design, influence technical decisions, and accelerate database and analytics migration into their deployment workflows. In summary, you'll help customers modernize their data platform and realize the full value of Microsoft's platform, all while enjoying flexible work opportunities.

Responsibilities
- Drive technical sales with decision makers using demos and PoCs to influence solution design and enable production deployments.
- Lead hands-on engagements (hackathons and architecture workshops) to accelerate adoption of Microsoft's cloud platforms.
- Build trusted relationships with platform leads, co-designing secure, scalable architectures and solutions.
- Resolve technical blockers and objections, collaborating with engineering to share insights and improve products.
- Maintain deep expertise in the analytics portfolio: Microsoft Fabric (OneLake, DW, real-time intelligence, BI, Copilot), Azure Databricks, and Purview Data Governance; and in Azure Databases: SQL DB, Cosmos DB, PostgreSQL.
- Maintain and grow expertise in on-prem EDW (Teradata, Netezza, Exadata), Hadoop, and BI solutions.
- Represent Microsoft through thought leadership in cloud Database & Analytics communities and customer forums.

Qualifications
- 10+ years technical pre-sales or technical consulting experience; OR a Bachelor's Degree in Computer Science, Information Technology, or a related field AND 4+ years technical pre-sales or technical consulting experience; OR a Master's Degree in Computer Science, Information Technology, or a related field AND 3+ years technical pre-sales or technical consulting experience; OR equivalent experience.
- Expert on Azure Databases (SQL DB, Cosmos DB, PostgreSQL), spanning migration and modernization and creating new AI apps.
- Expert on Azure Analytics (Fabric, Azure Databricks, Purview) and competitors (BigQuery, Redshift, Snowflake) in data warehouse, data lake, big data, analytics, real-time intelligence, and reporting using integrated Data Security & Governance.
- Proven ability to lead technical engagements (e.g., hackathons, PoCs, MVPs) that drive production-scale outcomes.
- 6+ years technical pre-sales, technical consulting, technology delivery, or related experience, OR equivalent experience.
- 4+ years experience with cloud and hybrid or on-premises infrastructure, architecture designs, migrations, industry standards, and/or technology management.
- Proficient in data warehouse and big data migration, including on-prem appliances (Teradata, Netezza, Oracle), Hadoop (Cloudera, Hortonworks), and Azure Synapse Gen2.

Microsoft is an equal opportunity employer. Consistent with applicable law, all qualified applicants will receive consideration for employment without regard to age, ancestry, citizenship, color, family or medical care leave, gender identity or expression, genetic information, immigration status, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran or military status, race, ethnicity, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable local laws, regulations and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application process, read more about requesting accommodations.
Posted 1 week ago
6.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Are you passionate about crafting engaging web applications, from front-end interfaces to robust back-end systems? Join our team and take ownership of the full development lifecycle. We are hiring! Bring your skills in front-end technologies and back-end development to our dynamic team!

In this role, we are looking for:
• 6+ years of back-end development experience, including:
  • 5+ years in cloud-native development using AWS
• Strong proficiency in Node.js, TypeScript, and REST APIs
• Experience with AWS CloudFront, S3, and Aurora PostgreSQL
• Demonstrated experience leading small teams or engineering squads
• Deep understanding of microservice architectures
• Familiarity with React.js, Next.js, and Auth0 integration
• Experience working in agile environments using Jira and Confluence
• Strong communication skills and ability to influence cross-functional stakeholders

Responsibilities:
• Develop and maintain REST APIs to support various applications and services
• Ensure secure and efficient access/login mechanisms using Auth0
• Collaborate with cross-functional teams to define, design, and ship new features
• Mentor and guide junior developers, fostering a culture of continuous learning and improvement
• Conduct code reviews and ensure adherence to best practices and coding standards
• Troubleshoot and resolve technical issues, ensuring high availability and performance of applications
• Stay updated with the latest industry trends and technologies to drive innovation within the team

Preferred Qualifications:
• Experience with enterprise-scale, web-based applications
• Exposure to Databricks or other large-scale data platforms (no direct engineering required)
• Previous experience rotating between teams and adapting to different scopes and responsibilities

Want to know more about this position, or know someone in your network who would be great for this role? Apply now and let's shape the future together!
Posted 1 week ago
4.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Job Title: Databricks Engineer
Location: [NCR / Bengaluru]
Job Type: [Full-time]
Experience Level: 4+ years in data engineering with a strong focus on Databricks
Domain: [Healthcare]

Job Summary
We are seeking a highly skilled and motivated Databricks Engineer to join our data engineering team. The ideal candidate will have strong experience in designing, developing, and optimizing large-scale data pipelines and analytics solutions using the Databricks Unified Analytics Platform, Apache Spark, Delta Lake, Data Factory, and modern data lake/lakehouse architectures. You will work closely with data architects, data scientists, and business stakeholders to enable high-quality, scalable, and reliable data processing frameworks that support business intelligence, advanced analytics, and machine learning initiatives.

Key Responsibilities
- Design and implement batch and real-time ETL/ELT pipelines using Databricks and Apache Spark.
- Ingest, transform, and deliver structured and semi-structured data from diverse data sources (e.g., file systems, databases, APIs, event streams).
- Develop reusable Databricks notebooks, jobs, and libraries for repeatable data workflows.
- Implement and manage Delta Lake solutions to support ACID transactions, time travel, and schema evolution.
- Ensure data integrity through validation, profiling, and automated quality checks.
- Apply data governance principles, including access control, encryption, and data lineage, using available tools (e.g., Unity Catalog, external metadata catalogs).
- Work with data scientists and analysts to deliver clean, curated, and analysis-ready data.
- Profile and optimize Spark jobs for performance, scalability, and cost.
- Monitor, debug, and troubleshoot data pipelines and distributed processing issues.
- Set up alerting and monitoring for long-running or failed jobs.
- Participate in the CI/CD lifecycle using tools like Git, GitHub Actions, Jenkins, or Azure DevOps.

Required Skills & Experience
- 4+ years of experience in data engineering.
- Strong hands-on experience with Apache Spark (DataFrames, Spark SQL, RDDs, Structured Streaming).
- Proficiency in Python (PySpark) and SQL for data processing and transformation.
- Understanding of cloud environments (Azure and AWS).
- Solid understanding of Delta Lake, Data Factory, and lakehouse architecture.
- Experience working with various data formats such as Parquet, JSON, Avro, and CSV.
- Familiarity with DevOps practices, version control (Git), and CI/CD pipelines for data workflows.
- Experience with data modeling, dimensional modeling, and data warehouse concepts.
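As a short illustration of the Delta Lake ACID/time-travel capability named in the responsibilities above, here is a minimal PySpark sketch; the table path is a hypothetical placeholder.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # on Databricks, Delta support is built in

path = "/mnt/datalake/silver/patients"  # hypothetical Delta table path

# Read the current state of the table
current = spark.read.format("delta").load(path)

# Time travel: read the table as of an earlier version for audit or rollback checks
v0 = spark.read.format("delta").option("versionAsOf", 0).load(path)

print(current.count() - v0.count(), "rows added since version 0")
```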
Posted 1 week ago
4.0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Senior Associate

Job Description & Summary
At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.

Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Responsibilities / Key Attributes:
- Experience of implementing and delivering data solutions and pipelines on the AWS Cloud Platform.
- Design, implement, and maintain the data architecture for all AWS data services.
- A strong understanding of data modelling, data structures, databases (Redshift), and ETL processes.
- Work with stakeholders to identify business needs and requirements for data-related projects.
- Strong SQL and/or Python or PySpark knowledge.
- Creating data models that can be used to extract information from various sources and store it in a usable format.
- Optimize data models for performance and efficiency.
- Write SQL queries to support data analysis and reporting.
- Monitor and troubleshoot data pipelines.
- Collaborate with software engineers to design and implement data-driven features.
- Perform root cause analysis on data issues.
- Maintain documentation of the data architecture and ETL processes.
- Identifying opportunities to improve performance by improving database structure or indexing methods.
- Maintaining existing applications by updating existing code or adding new features to meet new requirements.
- Designing and implementing security measures to protect data from unauthorized access or misuse.
- Recommending infrastructure changes to improve capacity or performance.
- Experience in the process industry.

Mandatory skill sets: Data Modelling, AWS, ETL
Preferred skill sets: Data Modelling, AWS, ETL
Years of experience required: 4-8 years
Education qualification: BE, B.Tech, MCA, M.Tech

Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Bachelor of Engineering, Master of Engineering
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)

Required Skills: ETL Development
Optional Skills: Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling, Data Pipeline {+ 27 more}

Desired Languages (If blank, desired languages not specified)
Travel Requirements:
Available for Work Visa Sponsorship?
Government Clearance Required?
Job Posting End Date
Posted 1 week ago
8.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Job Description
You are a strategic thinker passionate about driving solutions in Data Governance. You have found the right team.

As a Data Governance Associate in our Finance team, you will spend each day defining, refining, and delivering set goals for our firm. In your role as a Senior Associate in the CAO – Data Governance Team, you will execute data quality initiatives and contribute to data governance practices, including data lineage, contracts, and classification. Under the guidance of the VP, you will ensure data integrity and compliance, utilizing cloud platforms, data analytics tools, and SQL expertise. You will be part of a team that provides resources and support to manage data risks globally, lead strategic data projects, and promote data ownership within JPMC's Chief Administrative Office.

Job Responsibilities
- Collaborate with leadership and stakeholders to support the CAO Data Governance program by facilitating communication and ensuring alignment with organizational goals.
- Implement and maintain a data quality operating model, including standards, rules, and processes, to ensure prioritized data is fit for purpose and meets business needs.
- Manage the data quality issue management lifecycle, coordinating between the CDO, application owners, data owners, information owners, and other stakeholders to ensure timely resolution and continuous improvement.
- Align with evolving firmwide CDAO Data Quality policies, standards, and best practices, incorporating requirements into the CAO CDAO data governance framework to ensure compliance and consistency.
- Implement data governance frameworks on CAO Data Lake structures to enhance data accessibility, usability, and integrity across the organization.

Required Qualifications, Capabilities, and Skills
- 8+ years of experience in data quality management or data governance within financial services.
- Experience with data management tools such as Talend, Alteryx, Soda, and Collibra.
- Experience with visualization tools like Tableau and Qlik Sense.
- Experience with Agile/Scrum methodologies and tools (Confluence, Jira).
- Familiarity with Microsoft desktop productivity tools (Excel, PowerPoint, Visio, Word, SharePoint, Teams).

Preferred Qualifications, Capabilities, and Skills
- Lean/Six Sigma experience is a plus.
- Proficiency in cloud platforms like GCP and AWS, with data lake implementation experience.
- Experience with Databricks or similar for data processing and analytics.

ABOUT US
JPMorganChase, one of the oldest financial institutions, offers innovative financial solutions to millions of consumers, small businesses and many of the world's most prominent corporate, institutional and government clients under the J.P. Morgan and Chase brands. Our history spans over 200 years and today we are a leader in investment banking, consumer and small business banking, commercial banking, financial transaction processing and asset management. We recognize that our people are our strength and the diverse talents they bring to our global workforce are directly linked to our success. We are an equal opportunity employer and place a high value on diversity and inclusion at our company. We do not discriminate on the basis of any protected attribute, including race, religion, color, national origin, gender, sexual orientation, gender identity, gender expression, age, marital or veteran status, pregnancy or disability, or any other basis protected under applicable law. We also make reasonable accommodations for applicants' and employees' religious practices and beliefs, as well as mental health or physical disability needs. Visit our FAQs for more information about requesting an accommodation.

About The Team
Our professionals in our Corporate Functions cover a diverse range of areas from finance and risk to human resources and marketing. Our corporate teams are an essential part of our company, ensuring that we're setting our businesses, clients, customers and employees up for success.
Posted 1 week ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Introduction
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.

Your Role and Responsibilities
As an Associate Software Developer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include:
- Implementing and validating predictive models, as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques
- Designing and implementing various enterprise search applications such as Elasticsearch and Splunk for client requirements
- Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviors
- Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modeling results

Preferred Education: Master's Degree

Required Technical and Professional Expertise
- Total experience 6-7 years (relevant 4-5 years)
- Mandatory skills: Azure Databricks, Python/PySpark, SQL, GitHub, Azure DevOps, Azure Blob
- Ability to use programming languages like Java, Python, Scala, etc., to build pipelines to extract and transform data from a repository to a data consumer
- Ability to use Extract, Transform, and Load (ETL) tools and/or data integration or federation tools to prepare and transform data as needed
- Ability to use leading-edge tools such as Linux, SQL, Python, Spark, Hadoop and Java

Preferred Technical and Professional Experience
- You thrive on teamwork and have excellent verbal and written communication skills
- Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions
- Ability to communicate results to technical and non-technical audiences
Posted 1 week ago
7.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
Who We Are Boston Consulting Group partners with leaders in business and society to tackle their most important challenges and capture their greatest opportunities. BCG was the pioneer in business strategy when it was founded in 1963. Today, we help clients with total transformation-inspiring complex change, enabling organizations to grow, building competitive advantage, and driving bottom-line impact. To succeed, organizations must blend digital and human capabilities. Our diverse, global teams bring deep industry and functional expertise and a range of perspectives to spark change. BCG delivers solutions through leading-edge management consulting along with technology and design, corporate and digital ventures—and business purpose. We work in a uniquely collaborative model across the firm and throughout all levels of the client organization, generating results that allow our clients to thrive. What You'll Do The Global Information and AI Security Senior Manager provides internal BCG technical consulting around information security architecture and security design measures for new projects, ventures and systems. The architect defines the desired end state to meet solution Security Goals and overall business goals. The Security Architect ensures the digital applications, tools, and services protect our data, our clients’ data, and our intellectual property; are resilient to cyber-attack; meet BCG policy and standards, regulatory requirements, and industry best practices; while using a risk-based approach to meeting BCG business needs and objectives. The Global Information and AI Security Senior Manager works with teams inside BCG to secure the building and maintenance of complex computing environments to train, deploy, and operate Artificial Intelligence/ML systems by determining security requirements; planning, implementing and testing security systems; participate in AI/ML/LLM projects as the Security Subject Matter Expert; preparing security standards, policies and procedures; and mentoring team members. What You'll Bring Bachelor's degree (or equivalent experience) required. CSSLP certification required; additional certifications such as CISSP, CCSP, or CCSK strongly preferred. 7+ years of progressive experience in information security, specifically focused on secure architecture, secure development practices, and cloud-native security. Proven expertise supporting software engineering, data science, and AI/ML development teams, specifically with secure model lifecycle management, secure deployment practices, and secure data engineering. Expert understanding of the Secure Software Development Lifecycle (SSDLC), including secure architecture, threat modeling frameworks (e.g., MAESTRO, PASTA, STRIDE), penetration testing, secure coding practices, vulnerability management, and incident response. Demonstrated technical proficiency across multiple security technologies, platforms, and frameworks, with strong hands-on experience implementing secure cloud-native infrastructures (AWS, Azure, GCP). Familiarity with data warehouse and data lake environments such as Databricks, Azure Fabric, or Snowflake, including security best practices in managing and securing large-scale data ecosystems. In-depth knowledge and practical experience with AI and machine learning model security, ethical AI frameworks, secure handling of data, and comprehensive understanding of CI/CD pipelines specifically tailored for data science workloads. 
Extensive experience conducting security assessments, vulnerability triage, intrusion detection and prevention, firewall management, network vulnerability analysis, cryptographic implementations, and incident response analysis.
Exceptional communication skills (written and oral), influencing capabilities, and the ability to clearly articulate complex security concepts to stakeholders across various levels of the organization.
Proactive professional development, continuous learning, active participation in industry forums and professional networks, and familiarity with current and emerging security trends and standards.

Additional Info
You're Good At
The Senior Manager, Security and AI Architect excels at:
Collaborating closely with software engineering, data science, data engineering, and cybersecurity teams to design, implement, and maintain secure solutions in agile environments leveraging cloud-native technologies and infrastructure.
Defining security requirements by deeply understanding business objectives, evaluating strategies, and implementing robust security standards throughout the full Software Development Life Cycle (SDLC).
Leading security risk assessments, threat modeling (utilizing frameworks such as MAESTRO, PASTA, STRIDE, etc.), security architecture reviews, and vulnerability analyses for client-facing digital products, particularly involving complex AI/ML-driven solutions.
Advising development teams, including AI engineers and data scientists, on secure coding practices, secure data handling, secure AI/ML model deployment, and related infrastructure security considerations.
Providing specialized guidance on the secure AI model development lifecycle, including secure data usage, ethical AI practices, and robust security controls in Generative AI and large language model deployments.
Actively participating in the APAC Dex process for managing digital builds, ensuring alignment with regional requirements, standards, and best practices.
Staying ahead of emerging security trends and technologies, conducting continuous research, evaluation, and advocacy of new security tools, frameworks, and architectures relevant to digital solutions.
Ensuring robust compliance with regulatory frameworks and industry standards, including ISO 27001, SOC 2, NIST, and GDPR, particularly as they pertain to data privacy and AI-driven product development.
Developing and delivering training programs on secure development, AI security considerations, and incident response practices.
Partnering with internal stakeholders, articulating security risks clearly, influencing technical directions, and promoting comprehensive secure architecture roadmaps.
Conducting vendor and market assessments, guiding tests, evaluations, and implementation of security products that address enterprise and client-specific information security requirements.
Advising teams on compensating controls and alternative security measures to facilitate business agility without compromising security posture.
Leading the implementation and continuous improvement of security tooling and practices within CI/CD pipelines, infrastructure-as-code (IaC), and model deployment automation (a minimal artifact-verification sketch follows this posting).
Boston Consulting Group is an Equal Opportunity Employer.
All qualified applicants will receive consideration for employment without regard to race, color, age, religion, sex, sexual orientation, gender identity / expression, national origin, disability, protected veteran status, or any other characteristic protected under national, provincial, or local law, where applicable, and those with criminal histories will be considered in a manner consistent with applicable state and local laws. BCG is an E - Verify Employer. Click here for more information on E-Verify.
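As one concrete illustration of the secure model deployment controls described above, here is a minimal Python sketch that verifies a model artifact's SHA-256 checksum against an expected value before loading it. The file path and expected digest are hypothetical placeholders; a production control would more likely rely on signed artifacts tracked in a model registry rather than a hard-coded hash.

```python
# Minimal sketch: refuse to load a model artifact whose checksum does not match
# the value recorded at registration time. Path and digest are hypothetical.
import hashlib
from pathlib import Path

EXPECTED_SHA256 = "9f2c...e41a"  # placeholder; would come from the model registry

def verify_artifact(path: str, expected: str) -> None:
    """Raise if the on-disk artifact differs from the registered checksum."""
    digest = hashlib.sha256(Path(path).read_bytes()).hexdigest()
    if digest != expected:
        raise RuntimeError(f"Model artifact {path} failed integrity check")

verify_artifact("models/churn_model.pkl", EXPECTED_SHA256)
# Only after verification would the pipeline proceed to deserialize the model.
```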
Posted 1 week ago
3.0 - 5.0 years
5 - 8 Lacs
Mumbai, Delhi / NCR, Bengaluru
Work from Office
As a Senior Azure Data Engineer, your responsibilities will include:
Building scalable data pipelines using Databricks and PySpark
Transforming raw data into usable business insights
Integrating Azure services like Blob Storage, Data Lake, and Synapse Analytics
Deploying and maintaining machine learning models using MLlib or TensorFlow
Executing large-scale Spark jobs with performance tuning on Spark Pools
Leveraging Databricks Notebooks and managing workflows with MLflow (see the sketch below)

Qualifications:
Bachelor's/Master's in Computer Science, Data Science, or equivalent
7+ years in Data Engineering, with 3+ years in Azure Databricks
Strong hands-on skills in PySpark, Spark SQL, RDDs, Pandas, NumPy, and Delta Lake
Azure ecosystem: Data Lake, Blob Storage, Synapse Analytics

Contract position. Location: Remote - Delhi NCR, Bangalore, Chennai, Pune, Kolkata, Ahmedabad, Mumbai, Hyderabad
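To illustrate the MLflow workflow management this posting mentions, here is a minimal tracking sketch, assuming a Databricks or local MLflow tracking server is available. The experiment name, hyperparameters, and synthetic data are hypothetical examples.

```python
# Minimal MLflow sketch: log parameters, a metric, and a trained model for one run.
# Experiment name, hyperparameters, and data are hypothetical placeholders.
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=500, random_state=42)

mlflow.set_experiment("/demo/churn_experiment")
with mlflow.start_run():
    model = RandomForestClassifier(n_estimators=100, random_state=42)
    model.fit(X, y)
    mlflow.log_param("n_estimators", 100)
    mlflow.log_metric("train_accuracy", model.score(X, y))
    # Logged model becomes a versioned artifact that can later be promoted
    # through the model registry.
    mlflow.sklearn.log_model(model, "model")
```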
Posted 1 week ago
3.0 - 6.0 years
0 Lacs
Thane, Maharashtra, India
On-site
Your responsibilities
We are seeking a highly experienced Senior Data Engineer with expertise in Azure-based cloud architecture to join our team. In this role, you will design, build, and optimize complex data pipelines and cloud infrastructure to support data-driven business decisions. You'll be responsible for implementing robust, scalable solutions leveraging Azure Synapse Analytics, Databricks, ADF, and SQL with DevOps. The ideal candidate will also possess experience in Power BI, data mining, data analysis, and migration of on-premises systems to the cloud. Familiarity with Microsoft Fabric is an added advantage.

Key Responsibilities
Cloud Architecture & Infrastructure: Design and implement Azure-based cloud infrastructure, including data storage, processing, and analytics components. Develop and optimize scalable data architectures to ensure high performance and availability.
Data Pipeline Development: Build and manage ETL/ELT pipelines using Azure Synapse, Azure Data Factory (ADF), and Databricks. Ensure the efficient flow of data from source to target systems, implementing robust data quality controls (see the sketch below).
Data Transformation & Analysis: Utilize SQL, Synapse, and Databricks for data transformation, data mining, and advanced data analysis. Implement best practices for data governance, lineage, and security within Azure environments.
Migration Projects: Lead the migration of on-premises data systems to Azure cloud infrastructure, ensuring minimal disruption and data integrity. Optimize data migration strategies and methodologies for various applications and workloads.
DevOps & CI/CD Pipelines: Manage DevOps processes, ensuring continuous integration and deployment (CI/CD) for data solutions. Develop and maintain infrastructure as code (IaC) for deployment, testing, and monitoring.
Business Intelligence & Reporting: Collaborate with business stakeholders to design and implement reporting solutions in Power BI, ensuring data is accessible and actionable. Develop visualizations and dashboards to support data-driven decision-making.
Collaboration & Best Practices: Work closely with data scientists, analysts, and other business stakeholders to understand requirements and provide optimized data solutions. Drive best practices in data engineering, including coding standards, testing, version control, and documentation.

Your profile
Candidates must have 3-6 years of experience.
Education: Bachelor's or master's degree in Computer Science, Information Technology, Data Engineering, or a related field.
Technical Expertise:
Azure Cloud: Advanced proficiency in Azure services, including Synapse Analytics, Data Factory, Databricks, SQL, Blob Storage, and Data Lake.
Data Engineering: Strong skills in SQL, ETL/ELT processes, data warehousing, and data modeling.
DevOps: Experience with CI/CD pipeline setup, automation, and Azure DevOps.
Data Analysis & BI: Proficient in data analysis and visualization using Power BI; experience in data mining techniques is desirable.
Migration Experience: Proven track record of migrating on-premises systems to Azure cloud.
Additional Skills:
Knowledge of Microsoft Fabric is a plus.
Familiarity with Infrastructure as Code (IaC) tools like ARM templates or Terraform.
Strong understanding of data governance, security, and compliance best practices.
Soft Skills:
Strong problem-solving and analytical skills.
Excellent communication skills, with the ability to collaborate effectively across teams.
Ability to manage multiple priorities and work independently in a dynamic environment.
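As a sketch of the data quality controls mentioned under Data Pipeline Development, here is a minimal PySpark validation step that rejects a batch when null-key rates or duplicate keys exceed a threshold. The table paths, key column, and thresholds are hypothetical examples.

```python
# Minimal data-quality gate sketch: fail the pipeline run when a batch
# violates basic expectations. Paths, column names, and thresholds are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dq_gate").getOrCreate()
df = spark.read.format("delta").load("/mnt/staging/customers")

total = df.count()
null_keys = df.filter(F.col("customer_id").isNull()).count()
duplicate_keys = total - df.dropDuplicates(["customer_id"]).count()

# Reject the batch if it is empty, more than 1% of keys are null, or any duplicates exist.
if total == 0 or null_keys / total > 0.01 or duplicate_keys > 0:
    raise ValueError(
        f"Data quality gate failed: {null_keys} null keys, "
        f"{duplicate_keys} duplicates out of {total} rows"
    )

df.write.format("delta").mode("append").save("/mnt/curated/customers")
```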
Work location: Thane (Mumbai)

Your benefits
Company Home - thyssenkrupp Materials Services (thyssenkrupp-materials-services.com)

Contact
Vinit Poojary - tkmits-in-recruitment@thyssenkrupp-materials.com
Posted 1 week ago
4.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Description
We have an outstanding opportunity for an expert AI and Data Cloud Solutions Engineer to work with our trailblazing customers in crafting ground-breaking customer engagement roadmaps, demonstrating the Salesforce applications and platform across the machine learning and LLM/GPT domains in India! The successful applicant will have a track record in driving business outcomes through technology solutions, with experience in engaging at the C-level with business and technology groups.

Responsibilities
Primary pre-sales technical authority for all aspects of AI usage within the Salesforce product portfolio - existing Einstein ML-based capabilities and new (2023) generative AI
Majority of time (60%+) will be customer/external facing
Evangelisation of Salesforce AI capabilities
Assessing customer requirements and use cases and aligning them to these capabilities
Solution proposals, working with Architects and the wider Solution Engineer (SE) teams

Preferred Qualifications
Expertise in an AI-related subject (ML, deep learning, NLP, etc.)
Familiarity with technologies such as OpenAI, Google Vertex, Amazon SageMaker, Snowflake, Databricks, etc.

Required Qualifications
Experience will be evaluated based on the core proficiencies of the role.
4+ years working directly in the commercial technology space with AI products and solutions
Data knowledge - data science, data lakes and warehouses, ETL, ELT, data quality
AI knowledge - application of algorithms and models to solve business problems (ML, LLMs, GPT)
10+ years working in a sales, pre-sales, consulting, or related function in a commercial software company
Strong focus and experience in pre-sales or implementation is required
Experience in demonstrating customer engagement solutions, understanding and driving use cases and customer journeys, with the ability to draw a 'day in the life of' across different lines of business
Business analysis, business case, and return-on-investment construction
Demonstrable experience in presenting and communicating complex concepts to large audiences
A broad understanding of, and the ability to articulate, the benefits of CRM, Sales, Service, and Marketing Cloud offerings
Strong verbal and written communication skills with a focus on needs analysis, positioning, business justification, and closing techniques
A continuous-learning mindset with a demonstrated history of self-enablement and advancement in both technology and behavioural areas
Building reference models/ideas/approaches for inclusion of GPT-based products within wider Salesforce solution architectures, especially involving Data Cloud
Alignment with customer security and privacy teams on trust capabilities and values of our solution(s)
Presenting at multiple customer events, from single-account sessions through to major strategic events (World Tour, Dreamforce)
Representing Salesforce at other events (subject to PM approval)
Sales and SE organisation education and enablement, e.g. roadmap - all roles across all product areas
Bridge/primary contact point to product management
Provide thought leadership in how large enterprise organisations can drive customer success through digital transformation
Ability to uncover the challenges and issues a business is facing by running successful and targeted discovery sessions and workshops
Be an innovator who can build new solutions using out-of-the-box thinking
Demonstrate the business value of our AI solutions using solution presentations, demonstrations, and prototypes
Build roadmaps that clearly articulate how partners can implement and adopt solutions to move from the current to the future state
Deliver functional and technical responses to RFPs/RFIs
Work as an excellent teammate by chipping in, learning, and sharing new knowledge
Demonstrate a conceptual knowledge of how to integrate cloud applications with existing business applications and technology
Lead multiple customer engagements concurrently
Be self-motivated, flexible, and take initiative
Posted 1 week ago
12.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Welcome to Warner Bros. Discovery… the stuff dreams are made of.

Who We Are…
When we say, "the stuff dreams are made of," we're not just referring to the world of wizards, dragons and superheroes, or even to the wonders of Planet Earth. Behind WBD's vast portfolio of iconic content and beloved brands are the storytellers bringing our characters to life, the creators bringing them to your living rooms, and the dreamers creating what's next… From brilliant creatives to technology trailblazers across the globe, WBD offers career-defining opportunities, thoughtfully curated benefits, and the tools to explore and grow into your best selves. Here you are supported, here you are celebrated, here you can thrive.

Sr. Manager - Data Platform, Engineering - Hyderabad, India

About Warner Bros. Discovery
Warner Bros. Discovery, a premier global media and entertainment company, offers audiences the world's most differentiated and complete portfolio of content, brands and franchises across television, film, streaming and gaming. The new company combines Warner Media's premium entertainment, sports and news assets with Discovery's leading non-fiction and international entertainment and sports businesses. For more information, please visit www.wbd.com.

Roles & Responsibilities
As an Engineering Manager here, you are passionate about using software-based approaches to solve complex data-driven challenges and automate those solutions. Within our organization, you'll lead efforts aimed at scaling our existing data offerings and establish the technical strategy for how we can better equip engineers and leaders with the Data Platform. You'll build a deep understanding of our digital streaming service and use that knowledge, coupled with your engineering, infrastructure, data, and cloud knowledge, to optimize and evolve how we understand our technical ecosystem. To be successful, you'll need to be deeply technical and capable of holding your own with other strong peers. You possess excellent collaboration and diplomacy skills. You have experience practicing infrastructure-as-code, data lake management, AI/ML, and analytics. In addition, you'll have strong systems knowledge and troubleshooting abilities.
Develop streaming and batch analytics pipelines to build impactful data products around a semantic layer
Create tools and frameworks that enhance data processing, information retrieval, governance, and data quality in a cost-effective and user-friendly manner
Promote a culture of experimentation and data-driven innovation
Inspire and motivate through internal and external presentations and other speaking opportunities
Own the end-to-end architecture of the data platform, ensuring its efficiency, cost-effectiveness, security, and governance
Collaborate closely with other engineers to design and build an optimal and cost-efficient platform solution
Work in partnership with other engineers and managers to design and develop foundational elements of the platform
Assist in hiring, mentoring, and coaching engineers
Help build an engineering team that prioritizes empathy, diversity, and inclusion

What To Bring
Bachelor's degree in computer science or a similar discipline
12+ years of a commendable track record of delivering complex software engineering systems and distributed platforms using open-source technologies
12+ years of experience and proficiency in building and managing real-time data processing pipelines with streaming platforms like Apache Kafka, AWS Kinesis, or Google Pub/Sub (a minimal streaming sketch follows this posting)
10+ years of a strong foundation in distributed data processing concepts and event-driven architectures, with an understanding of batch and stream processing technologies and experience building streaming and batch pipelines
12+ years of programming experience with proficiency in Java, C, C++, or similar languages
8+ years of experience with a wide variety of distributed systems and technologies, such as Apache Flink, Apache Spark, Kafka, Airflow, Kubernetes, Databricks, and Snowflake
10+ years of cloud (AWS preferred) experience
10+ years of experience with containerization (Docker) and orchestration tools (Kubernetes), plus Prometheus, Grafana, Kibana, Elasticsearch, Cassandra, and DynamoDB
10+ years of experience designing scalable and resilient streaming applications with microservices architecture
Experience leading in a highly cross-functional environment, likely collaborating closely with Engineering, Product Management, and/or Data Science
Strong interpersonal, communication, and presentation skills

Nice to Have
Exposure to tools like Apache Beam or Spark Streaming
Familiarity with integrating ML models into Flink pipelines is a plus

What We Offer
A great place to work
Equal opportunity employer
Fast-track growth opportunities

How We Get Things Done…
This last bit is probably the most important! Here at WBD, our guiding principles are the core values by which we operate and are central to how we get things done. You can find them at www.wbd.com/guiding-principles/ along with some insights from the team on what they mean and how they show up in their day to day. We hope they resonate with you and look forward to discussing them during your interview.

Championing Inclusion at WBD
Warner Bros. Discovery embraces the opportunity to build a workforce that reflects a wide array of perspectives, backgrounds and experiences. Being an equal opportunity employer means that we take seriously our responsibility to consider qualified candidates on the basis of merit, regardless of sex, gender identity, ethnicity, age, sexual orientation, religion or belief, marital status, pregnancy, parenthood, disability or any other category protected by law. If you're a qualified candidate with a disability and you require adjustments or accommodations during the job application and/or recruitment process, please visit our accessibility page for instructions to submit your request.
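To make the streaming-pipeline requirement concrete, here is a minimal PySpark Structured Streaming sketch that consumes events from a Kafka topic and maintains a running per-key count. The broker address and topic name are hypothetical placeholders, and running it requires the spark-sql-kafka connector on the classpath.

```python
# Minimal Spark Structured Streaming sketch: consume a Kafka topic and
# maintain a per-key event count. Broker address and topic are hypothetical.
# Requires the spark-sql-kafka-0-10 connector package at submit time.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("playback_events_stream").getOrCreate()

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "playback-events")
    .load()
)

# Kafka delivers key/value as binary; decode the key and count events per key.
counts = (
    events.select(F.col("key").cast("string").alias("device_id"))
    .groupBy("device_id")
    .count()
)

# Emit the running aggregate; a real pipeline would sink to Delta or a serving store.
query = counts.writeStream.outputMode("complete").format("console").start()
query.awaitTermination()
```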
Posted 1 week ago