4.0 - 10.0 years
22 - 27 Lacs
Bengaluru
Work from Office
At Elanco (NYSE: ELAN) it all starts with animals! As a global leader in animal health, we are dedicated to innovation and delivering products and services to prevent and treat disease in farm animals and pets. We're driven by our vision of Food and Companionship Enriching Life and our approach to sustainability, the Elanco Healthy Purpose, to advance the health of animals, people, the planet and our enterprise. Making animals' lives better makes life better. Join our team today!

Your Role: Sr Data Engineer. The data engineer's role is delivery-focused. The person in this role will drive data pipeline and data product delivery through data architecture, modeling, design, and development of professional-grade solutions on-premises and/or on the Microsoft Azure cloud. Partner with data scientists and statisticians across Elanco's global business functions to help prepare and transform their data into data products that further drive scientific and/or business knowledge discovery, insights, and forecasting. Data engineers will be part of a highly collaborative and cross-functional team of technology and data experts working on solving complex scientific and business challenges in animal health using cutting-edge data and analytics technologies.

Your Responsibilities: Provide data engineering subject matter expertise and hands-on data capture, ingestion, curation, and pipeline development expertise on Azure to deliver cloud-optimized data solutions. Provide expert data PaaS on Azure storage; big data platform services; serverless architectures; Azure SQL DB; NoSQL databases; and secure, automated data pipelines. Participate in data/data-pipeline architectural discussions to help build cloud-native solutions or migrate existing data applications from on-premises to the Azure platform. Perform current state (As-Is) and future state (To-Be) analysis. Participate in and help develop a data engineering community of practice as a global go-to expert panel/resource. Develop and evolve new or existing data engineering methods and procedures to create possible alternative, agile solutions to moderately complex problems.

What You Need to Succeed (minimum qualifications): At least 2 years of data pipeline and data product design, development, and delivery experience, including deploying ETL/ELT solutions on Azure Data Factory. Education: Bachelor's or higher degree in Computer Science or a related discipline.

What will give you a competitive edge (preferred qualifications): Experience with Azure-native data/big-data tools, technologies and services, including Storage Blobs, ADLS, Azure SQL DB, Cosmos DB, NoSQL and SQL Data Warehouse. Sound problem-solving skills in developing data pipelines using Databricks, Stream Analytics and Power BI. Minimum of 2 years of hands-on experience in programming languages, Azure and big data technologies such as PowerShell, C#, Java, Python, Scala, SQL, ADLS/Blob, Hadoop, Spark/SparkSQL, Hive, and streaming technologies like Kafka, Event Hub, etc.

Additional Information: Travel: 0%. Location: India, Bangalore.

Don't meet every single requirement? Studies have shown underrecognized groups are less likely to apply to jobs unless they meet every single qualification. At Elanco we are dedicated to building a diverse and inclusive work environment. If you think you might be a good fit for a role but don't necessarily meet every requirement, we encourage you to apply. You may be the right candidate for this role or other roles!
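For illustration only, a minimal PySpark sketch of the kind of capture-and-curation step this role describes; the schema, column names, and output path are hypothetical placeholders, not Elanco's actual pipelines (a real job would read from and write to ADLS):

```python
# Hypothetical curation step: raw events -> typed, de-duplicated curated table.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("curate-events").getOrCreate()

# Stand-in for raw data landed in a storage account (e.g., ADLS/Blob).
raw = spark.createDataFrame(
    [("A12", "2024-03-01", "vaccination"),
     ("A12", "2024-03-01", "vaccination"),   # duplicate record
     ("B07", "2024-03-02", "checkup")],
    ["animal_id", "event_date", "event_type"],
)

curated = (
    raw.dropDuplicates()                                   # basic de-duplication
       .withColumn("event_date", F.to_date("event_date"))  # enforce types
       .withColumn("ingested_at", F.current_timestamp())   # simple lineage column
)

# A production pipeline would target ADLS, e.g. an abfss:// path.
curated.write.mode("overwrite").parquet("/tmp/curated/animal_events")
```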
Elanco is an EEO/Affirmative Action Employer and does not discriminate on the basis of age, race, color, religion, gender, sexual orientation, gender identity, gender expression, national origin, protected veteran status, disability or any other legally protected status
Posted 1 month ago
12.0 - 22.0 years
20 - 32 Lacs
Pune, Chennai, Bengaluru
Work from Office
Location: Bengaluru, Chennai, Pune. Note: Immediate joiners only.

Core Qualifications: 12+ years in software & data architecture with hands-on delivery. Agentic AI & AWS Bedrock (Must-Have): Practical experience designing, deploying, and operating Agentic AI solutions using AWS Bedrock & Bedrock Agents. Cloud-Native AWS Expertise: Deep knowledge across compute, storage, networking, and security. Modern Architectures: Proven success in defining stacks for microservices, event-driven systems, and data platforms (e.g., Snowflake, Databricks). DevOps & IaC: Skilled in CI/CD pipelines and Infrastructure as Code using Azure DevOps & Terraform. Data & Integration: Strong in data modeling, REST/GraphQL API design, ETL/ELT, CDC, and messaging integration. Stakeholder Engagement: Excellent communicator with the ability to align tech solutions to business outcomes.

Preferred: Experience in media or broadcasting. Familiarity with Salesforce or enterprise iPaaS platforms. Certifications: AWS/Azure/GCP Architect, Salesforce Integration Architect, TOGAF.

Have questions? I'm happy to help; just connect with me on 9899080360 or email admin@spearheadps.com
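As a hedged sketch of invoking an already-deployed Bedrock Agent with boto3 (the agent and alias IDs are placeholders; this assumes the agent exists and the caller has the required IAM permissions):

```python
import uuid
import boto3

# bedrock-agent-runtime is the runtime API for invoking deployed Bedrock Agents.
client = boto3.client("bedrock-agent-runtime", region_name="us-east-1")

response = client.invoke_agent(
    agentId="AGENT_ID_PLACEHOLDER",        # hypothetical IDs
    agentAliasId="ALIAS_ID_PLACEHOLDER",
    sessionId=str(uuid.uuid4()),           # one session per conversation
    inputText="Summarize yesterday's failed media ingest jobs.",
)

# The completion is returned as an event stream of chunks.
for event in response["completion"]:
    chunk = event.get("chunk")
    if chunk:
        print(chunk["bytes"].decode("utf-8"), end="")
```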
Posted 1 month ago
19.0 - 23.0 years
60 - 75 Lacs
Bengaluru, Delhi / NCR
Hybrid
Preferred candidate profile: A Solution Architect (Data) to lead the design and implementation of scalable, secure, and high-performance data solutions. You will play a key role in defining the data architecture and strategy across enterprise platforms, ensuring alignment with business goals and IT standards. 18+ years of IT experience, with at least 5 years working as an Architect. Experience working as a Data Architect. Experience architecting reporting and analytics solutions. Experience architecting AI & ML solutions. Experience with Databricks.
Posted 1 month ago
8.0 - 13.0 years
32 - 45 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
Job Title: Data Architect. Location: Bangalore, Hyderabad, Chennai, Pune, Gurgaon (hybrid, 2-3 days WFO). Experience: 8+ years.

Position Overview: We are seeking a highly skilled and strategic Data Architect to design, build, and maintain the organization's data architecture. The ideal candidate will be responsible for aligning data solutions with business needs, ensuring data integrity, and enabling scalable and efficient data flows across the enterprise. This role requires deep expertise in data modeling, data integration, cloud data platforms, and governance practices.

Key Responsibilities: Architectural Design: Define and implement enterprise data architecture strategies, including data warehousing, data lakes, and real-time data systems. Data Modeling: Develop and maintain logical, physical, and conceptual data models to support analytics, reporting, and operational systems. Platform Management: Select and oversee implementation of cloud and on-premises data platforms (e.g., Snowflake, Redshift, BigQuery, Azure Synapse, Databricks). Integration & ETL: Design robust ETL/ELT pipelines and data integration frameworks using tools such as Apache Airflow, Informatica, dbt, or native cloud services (a minimal orchestration sketch follows this posting). Data Governance: Collaborate with stakeholders to implement data quality, data lineage, metadata management, and security best practices. Collaboration: Work closely with data engineers, analysts, software developers, and business teams to ensure seamless and secure data access. Performance Optimization: Tune databases, queries, and storage strategies for performance, scalability, and cost-efficiency. Documentation: Maintain comprehensive documentation for data structures, standards, and architectural decisions.

Required Qualifications: Bachelor's or Master's degree in Computer Science, Information Systems, or a related field. 5+ years of experience in data architecture, data engineering, or database development. Strong expertise in data modeling, relational and NoSQL databases (e.g., PostgreSQL, MySQL, MongoDB, Cassandra). Experience with modern data platforms and cloud ecosystems (AWS, Azure, or GCP). Hands-on experience with data warehousing solutions and tools (e.g., Snowflake, Redshift, BigQuery). Proficiency in SQL and data scripting languages (e.g., Python, Scala). Familiarity with data privacy regulations (e.g., GDPR, HIPAA) and security standards.

Tech Stack: AWS Cloud (S3, EC2, EMR, Lambda, IAM); Snowflake DB; Databricks; Spark/PySpark; Python. Good knowledge of Bedrock and Mistral AI. RAG & NLP: LangChain and LangRAG. LLMs: Anthropic Claude, Mistral, LLaMA, etc.
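As a minimal illustration of the ETL/ELT orchestration named above, a skeletal Apache Airflow DAG (task bodies, DAG ID, and schedule are hypothetical; a real pipeline would call out to warehouse, Spark, or dbt tasks):

```python
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull increments from source systems")   # placeholder

def transform():
    print("apply modeling and quality rules")      # placeholder

def load():
    print("publish to warehouse marts")            # placeholder

with DAG(
    dag_id="orders_elt",               # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                 # Airflow 2.4+ scheduling argument
    catchup=False,
) as dag:
    extract_t = PythonOperator(task_id="extract", python_callable=extract)
    transform_t = PythonOperator(task_id="transform", python_callable=transform)
    load_t = PythonOperator(task_id="load", python_callable=load)

    extract_t >> transform_t >> load_t   # linear extract -> transform -> load
```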
Posted 1 month ago
1.0 - 6.0 years
3 - 8 Lacs
Lucknow
Work from Office
Product Manager - Backend Services & Integrations.

Role Overview: As the Backend Services and Integration Product Manager, you will own the technical backbone of Urban IQ's platform. You'll drive the development of scalable, reliable backend services and integrations, ensuring our platform can seamlessly integrate with IoT devices and external systems.

Key Responsibilities: Own and define platform capabilities, focusing on backend scalability, APIs, and data pipelines. Collaborate with engineering teams to implement backend solutions that integrate AI and IoT technologies. Lead feasibility testing for backend services, ensuring performance and scalability meet customer requirements. Develop product specifications and user stories for backend services, using tools like Jira for quick, iterative cycles. Engage with customers to understand technical integration needs, working with cross-functional teams to ensure successful deployments. Partner with sales and marketing teams to build technical collateral for market-facing initiatives, showcasing the platform's backend strengths. Drive collaboration across departments, ensuring alignment on technical capabilities, customer needs, and market opportunities.

Qualifications: 1+ years of product management experience, with a focus on backend technologies and integrations. Hands-on experience working with scrum teams, driving technical projects from conception to delivery. Strong expertise in cloud platforms, data architecture, API development, and integrations. Good understanding of AI concepts and how to use agentic frameworks in solving complex business and operational problems. Proven ability to lead technical feasibility testing and develop product specifications in agile environments. Experience collaborating with sales, marketing, and engineering to build comprehensive product strategies. Excellent technical understanding, problem-solving skills, and leadership abilities. Engineering degree (BS and/or Masters) and prior technical experience preferred; an MBA would be a big plus in this role.
Posted 1 month ago
3.0 - 8.0 years
5 - 10 Lacs
Mumbai
Work from Office
Summary: Skima is seeking a skilled Data Engineer to join our dynamic team in Mumbai. The ideal candidate will have 3 years of experience in data engineering, with a strong background in designing, building, and maintaining scalable data pipelines and systems. As a Data Engineer at Skima, you will be responsible for developing and optimizing our data architecture, ensuring the reliability and efficiency of our data processes. You will work closely with data scientists, analysts, and other stakeholders to support their data needs and drive data-driven decision-making across the organization. The role requires proficiency in SQL, Python, and big data technologies such as Hadoop, Spark, and Kafka. Experience with cloud platforms like AWS, Azure, or Google Cloud is highly desirable. This is an in-office position, offering a competitive CTC range of 50,000 to 70,000. If you are passionate about data engineering and eager to contribute to a forward-thinking company, we encourage you to apply and become a part of Skima's innovative team.

Responsibilities: Design, build, and maintain scalable data pipelines and systems. Develop and optimize data architecture to ensure reliability and efficiency. Collaborate with data scientists, analysts, and other stakeholders to support their data needs. Drive data-driven decision-making across the organization. Ensure data quality and integrity through robust data validation and monitoring processes.

Requirements: 3 years of experience in data engineering. Strong background in designing, building, and maintaining scalable data pipelines and systems. Proficiency in SQL. Proficiency in Python. Experience with big data technologies such as Hadoop, Spark, and Kafka.
Posted 1 month ago
7.0 - 10.0 years
25 - 35 Lacs
Hyderabad
Work from Office
Job Summary: As a Senior Data Engineer, you will play a key role in developing and maintaining the databases and scripts that power Creditsafe's products and websites. You will be responsible for handling large datasets, designing scalable data pipelines, and ensuring seamless data processing across cloud environments. This role provides an excellent opportunity to contribute to an exciting, fast-paced, and rapidly expanding organization.

Key Responsibilities: Develop and maintain scalable, metadata-driven, event-based distributed data processing platforms. Design and implement data solutions using Python, Airflow, Redshift, DynamoDB, AWS Glue, and S3. Build and optimize APIs to securely handle over 1,000 transactions per second using serverless technologies (a sketch follows this posting). Participate in peer reviews and contribute to a clean, efficient, and high-performance codebase. Implement best practices such as continuous integration, test-driven development, and cloud optimization. Understand company and domain data to suggest improvements in existing products. Provide mentorship and technical leadership to the engineering team.

Skills & Qualifications: Proficiency in Python and experience in building scalable data pipelines. Experience working in cloud environments such as AWS (S3, Glue, Redshift, DynamoDB). Strong understanding of data architecture, database design, and event-driven data processing. Ability to write clean, efficient, and maintainable code. Excellent communication skills and ability to collaborate within a team. Experience in mentoring engineers and providing leadership on complex technology issues.

Benefits: Competitive salary and performance bonus scheme. Hybrid working model for better work-life balance. 20 days annual leave plus 10 bank holidays. Healthcare, company pension, gratuity, and parental insurance. Cab services for women for enhanced safety. Global company gatherings and career growth opportunities.
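To make the serverless API responsibility concrete, a hedged AWS Lambda handler sketch backed by DynamoDB (table, key, and route names are invented, not Creditsafe's schema):

```python
import json
import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("company-profiles")   # placeholder table name

def handler(event, context):
    """Low-latency read path behind API Gateway; Lambda scales horizontally for high TPS."""
    company_id = event["pathParameters"]["companyId"]   # assumes API Gateway proxy events
    item = table.get_item(Key={"company_id": company_id}).get("Item")
    if item is None:
        return {"statusCode": 404, "body": json.dumps({"error": "not found"})}
    return {"statusCode": 200, "body": json.dumps(item, default=str)}
```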
Posted 1 month ago
3.0 - 8.0 years
5 - 10 Lacs
Bengaluru
Work from Office
Job Summary: We are seeking a skilled and motivated AWS Glue Data Engineer to join our data engineering team. The ideal candidate will have hands-on experience designing and implementing scalable ETL pipelines using AWS Glue and a strong understanding of cloud-based data architecture. You will play a key role in transforming raw data into actionable insights that drive business decisions.

Key Responsibilities: Design, develop, and maintain ETL pipelines using AWS Databricks/Glue, PySpark, and other AWS services. Collaborate with data scientists, analysts, and business stakeholders to understand data requirements. Optimize data workflows for performance, scalability, and cost-efficiency. Implement data quality checks, validation, and monitoring processes. Integrate data from various sources including S3, RDS, Redshift, and external APIs. Ensure data security and compliance with organizational and regulatory standards. Document technical solutions and maintain data engineering best practices.

Required Qualifications: Bachelor's or Master's degree in Computer Science, Engineering, or a related field. 3 years of experience in data engineering, with at least 1-2 years using AWS Glue. Proficiency in Python, PySpark, and SQL. Strong understanding of AWS services such as S3, Lambda, Redshift, Athena, and IAM. Experience with data modeling, data warehousing, and big data technologies. Familiarity with CI/CD pipelines and version control (e.g., Git).

Preferred Qualifications: AWS Certified Data Analytics or Solutions Architect certification. Experience with orchestration tools like Apache Airflow or AWS Step Functions. Knowledge of data governance and metadata management tools. Exposure to DevOps practices and infrastructure-as-code (e.g., CloudFormation, Terraform).

What We Offer: Competitive salary and benefits. Flexible work environment. Opportunities for professional growth and certification. Collaborative and inclusive team culture.
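For orientation, the standard AWS Glue PySpark job skeleton with a simple quality filter added (catalog database, table, column, and bucket names are placeholders):

```python
import sys
from awsglue.utils import getResolvedOptions
from awsglue.context import GlueContext
from awsglue.job import Job
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_ctx = GlueContext(SparkContext())
job = Job(glue_ctx)
job.init(args["JOB_NAME"], args)

# Read from the Glue Data Catalog (hypothetical database/table names).
dyf = glue_ctx.create_dynamic_frame.from_catalog(database="raw", table_name="orders")

# Basic data quality checks: drop duplicate orders and non-positive amounts.
df = dyf.toDF().dropDuplicates(["order_id"]).filter("amount > 0")

df.write.mode("overwrite").parquet("s3://example-bucket/curated/orders/")  # placeholder path
job.commit()
```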
Posted 1 month ago
4.0 - 9.0 years
6 - 11 Lacs
Mumbai
Work from Office
Are you curious, excited by experimentation and always looking to innovate? Do you want to work in embedded payments where you can keep learning and developing whilst getting hands-on experience? Do you want to have the opportunity to play an important role in a rapidly growing and exciting Fintech business? If so, we would love to connect and collaborate! We want to hire ambitious, value-adding talent into Modulr, one of the fastest growing payments businesses in the UK and Europe. Modulr is experiencing significant growth, and in this role you will work in a cross-functional team who are asked to solve a problem, rather than be handed a task to do. This is an excellent opportunity to work in a high-growth environment with a fast-paced and collaborative culture where you will have great opportunities to work on challenging problems.

About us: At Modulr, our vision is a world where all businesses are powered by embedded payments. Modulr enables businesses, from SMEs to Enterprise, initially across the UK and Europe, to efficiently pay-in, collect and disburse funds instantly via a range of payment schemes, accounts, and card products. We have created an industry-leading API platform with comprehensive online tools and access to meet the demands of daily business payments. We have two routes to market. Our Core Business Payments product allows customers in any sector to connect to us and our expanding network of accounting and payroll platforms, including Sage, Xero, BrightPay and IRIS, to automate payments. Our Vertical Solutions targets a growing range of industry verticals which directly connect their IT platforms to our APIs and webhooks. We solve complex payment problems for hundreds of clients in a range of industries, including Travel, Lending, Wage Advance, and Investment & Wealth. We are deeply integrated into the payment ecosystem. In the UK, we are direct participants of Faster Payments and Bacs. Modulr holds settlement accounts at the Bank of England. Our payment network connectivity includes CHAPS, Open Banking, SEPA, SWIFT and account issuance in multiple currencies. We are principal issuing members of Visa and Mastercard schemes across the UK and Europe. Our regulatory permissions and governance structure are the foundations of our business. We are regulated and supervised as an Authorised Electronic Money Institution (AEMI) in the UK by the Financial Conduct Authority and in the Netherlands by De Nederlandsche Bank. Our founding team has a wealth of experience in the payments industry and growing successful businesses. Modulr is backed by the venture arms of payments giants PayPal and FIS, as well as growth investors Blenheim Chalcot, General Atlantic, Frog Capital and Highland Europe. Modulr now has over 400 employees spread globally across offices in London, Edinburgh, Amsterdam, and Mumbai.

Modulr values: Building the extraordinary; going that extra mile. Owning the opportunity; be passionate and proud of the time you invest. Move at pace; reach goals faster whilst supported on your career journey. Achieve it together, working collaboratively and being a Modulite.

The team: The Data team at Modulr is a dynamic and innovative group that is responsible for managing Modulr's data warehouse, reporting, analytics, and data science capabilities. This role will report to the Principal Data Engineer and will work closely with Product Managers, Business Stakeholders and other cross-functional teams.
You will have the opportunity to mentor other data team members and users, and contribute to the growth and development of the team.

Summary: The Data Engineer is a vital role within Modulr, and this role will support the continuous improvement and innovation of our data platform, ensuring processes are robust, efficient and scalable.

Specific duties: Extract and integrate data from various sources, including APIs, internal databases, and third-party platforms. Design and build efficient analytical data models using dimensional modelling methodologies and best practices. Build and maintain semantic models to enable self-service access to data for internal users. Write clean, maintainable, and well-documented code following best practices. Write and execute tests and data quality checks (see the sketch after this posting). Collaborate with cross-functional teams to understand data requirements and develop scalable and effective solutions. Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery and re-designing infrastructure for greater scalability.

About You: A successful Data Engineer will have a track record of delivering results in a fast-moving business and hence be comfortable with change and uncertainty. Excellent stakeholder management experience is essential to being successful in this role. 4+ years of experience in a Data Engineering role. Extensive experience with Python and SQL. Excellent knowledge of data architecture, data modelling and ETL methodologies. Experience using ETL and data orchestration tools. Experience integrating with BI tools and building semantic models. Proven experience in supporting and working with cross-functional teams in a dynamic environment. Comfortable working in fast-paced, agile environments with a focus on getting things done efficiently. Understanding of agile methodologies and practices.

Nice to Have: Experience using Snowflake. Experience using dbt (or similar). Experience using semantic layer tools (Cube.dev, AtScale or similar). Experience using Power BI, Streamlit. Experience using Data Quality tools (Soda, Great Expectations or similar). Building and using CI/CD pipelines. Understanding of AI/ML and GenAI frameworks. Experience with AWS (if not, then other cloud platforms).

ModInclusion: At Modulr, we are working hard to build a more positive, diverse and inclusive culture that helps everyone to feel they belong and can truly bring their whole self to work. Not only is it the right thing to do for everyone in the Modulr team, it's also the right thing to do for our business, the community we operate in and attracting future talent. As part of our approach, we actively welcome applications from candidates with diverse backgrounds. By submitting your CV you understand that we have a legitimate interest to use your personal data for the purposes of assessing your eligibility for this role. This means that we may use your personal data to contact you to discuss your CV or arrange an interview, or transfer your CV to the hiring manager(s) of the role you have applied for. You can ask us at any time to remove your CV from our database by emailing peopleops@modulrfinance.com - but please note that this means we will no longer consider you for the role you have applied for.
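As one hedged example of the tests and data quality checks listed above (table and column names are invented, not Modulr's schema), using plain pandas:

```python
import pandas as pd

# Hypothetical payments extract standing in for a warehouse staging table.
payments = pd.DataFrame({
    "payment_id": ["p1", "p2", "p3"],
    "amount_gbp": [120.0, -5.0, 300.0],
    "scheme": ["FPS", "Bacs", None],
})

# Declarative checks run before the data is loaded into analytical models.
checks = {
    "unique_payment_ids": payments["payment_id"].is_unique,
    "amounts_positive": bool((payments["amount_gbp"] > 0).all()),
    "scheme_not_null": bool(payments["scheme"].notna().all()),
}

failed = [name for name, ok in checks.items() if not ok]
if failed:
    raise ValueError(f"Data quality checks failed: {failed}")
```

Dedicated tools such as Soda or Great Expectations (both mentioned above) express the same idea as declarative test suites rather than hand-rolled assertions.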
Posted 1 month ago
10.0 - 15.0 years
35 - 40 Lacs
Bengaluru
Work from Office
Title: Senior Adobe Solution Architect. Date: 1 Jul 2025. Location: Bangalore, KA, IN.

Job Description: We are a technology-led healthcare solutions provider. We are driven by our purpose to enable healthcare organizations to be future-ready. We offer accelerated, global growth opportunities for talent that's bold, industrious, and nimble. With Indegene, you gain a unique career experience that celebrates entrepreneurship and is guided by passion, innovation, collaboration, and empathy. To explore exciting opportunities at the convergence of healthcare and technology, check out www.careers.indegene.com

Looking to jump-start your career? We understand how important the first few years of your career are, which create the foundation of your entire professional journey. At Indegene, we promise you a differentiated career experience. You will not only work at the exciting intersection of healthcare and technology but also will be mentored by some of the most brilliant minds in the industry. We are offering a global fast-track career where you can grow along with Indegene's high-speed growth. We are purpose-driven. We enable healthcare organizations to be future-ready, and our customer obsession is our driving force. We ensure that our customers achieve what they truly want. We are bold in our actions, nimble in our decision-making, and industrious in the way we work. If this excites you, then apply below.

Role: Senior Adobe Solution Architect.
Key Responsibilities: Serve as a senior Adobe technology expert, architecting robust, scalable Adobe Experience Cloud solutions tailored to the pharma and life sciences industry. Design and deliver end-to-end digital solutions across Adobe Experience Manager (AEM), Adobe Analytics, Adobe Target, and Campaign. Translate business needs into technical architectures and solution designs in compliance with healthcare industry standards and regulatory requirements. Collaborate with product, delivery, and client engagement teams to ensure successful implementation and integration of Adobe solutions. Conduct technical workshops and stakeholder sessions to define solution strategies, integration requirements, and data architecture. Lead RFP solutioning and estimation efforts, working closely with sales, pre-sales, and marketing teams to support new business development. Mentor and coach junior architects and technical leads, promoting technical excellence and innovation. Ensure adherence to best practices, governance, and security protocols across Adobe solution implementations. Stay current with Adobe product updates, releases, and industry trends to advise clients on future-ready strategies. Drive innovation by developing reusable assets, accelerators, and frameworks that enhance delivery efficiency.

Must Have: 10+ years of hands-on experience with Adobe Experience Cloud solutions, including deep expertise in AEM Sites and Assets, Adobe Analytics, and Target. Proven track record in architecting complex digital solutions and leading Adobe platform implementations, preferably in pharma/life sciences. Strong understanding of martech ecosystems, data integration, and personalization strategies. Adept at stakeholder management, with experience working with C-level executives and cross-functional teams. Experience with Agile, DevOps, and CI/CD best practices. Excellent communication, presentation, and problem-solving skills. Adobe Certified Expert (e.g., AEM Architect, Adobe Analytics Business Practitioner) preferred.

Good to have: PMP, ITIL, or Agile certifications are a plus.

EQUAL OPPORTUNITY: Indegene is an equal opportunity employer.
Posted 1 month ago
11.0 - 17.0 years
35 - 40 Lacs
Chennai
Work from Office
Senior Management | Full-Time | Supply Chain and Logistics

Are you a visionary leader ready to transform the future of renewable energy with data and AI? At Vestas, we're looking for a Vice President of AI and Data Products to accelerate execution of our Data & AI strategy and deliver impact on a global scale. In this role, you will collaborate closely with senior leadership and digital experts to rethink how we leverage data and AI, positioning Vestas at the forefront of digital transformation in the energy sector. If this sounds interesting, we'd like to hear from you.

Digital Solutions & Development > Data Domain & AI: At Vestas, the Data & AI unit is crucial to our digital transformation, cultivating innovation and operational excellence by leveraging data and artificial intelligence. This essential team plays a vital role in enabling strategic decision-making, optimising business processes, and delivering measurable value throughout the organisation. With primary hubs in Aarhus and Copenhagen (Denmark) and Chennai (India), the unit consists of 75 skilled professionals, including product owners, chapter leads, business analysts, data engineers, information architects, and Scrum masters. Together, they form a collaborative ecosystem that bridges technology and business to unlock new opportunities. In your role as Vice President of AI and Data Products, you will oversee this forward-thinking organisation, transforming the landscape of data and AI at Vestas. Alongside your leadership team, you will foster a culture of collaboration, accountability, simplicity, and passion, empowering the team, together with our full digital capacity, to reach ambitious goals and accelerate our journey toward autonomous products and operations.

Responsibilities: Oversee the development and operations of Data & AI products to rethink Vestas' business and drive automation and profitability. Ensure quality and scalability across our data and AI practices within digital transformation initiatives. Partner with senior business leaders to identify high-impact opportunities and deliver AI-driven solutions that create tangible business value. Act as a visible leader and role model for Data & AI across Vestas, influencing stakeholders and driving adoption. Launch and scale reusable, reliable data products built on a modular architecture and governed by enterprise business objects. Deliver on Vestas' AI Big Bets: multi-year AI transformations that create competitive advantage. Build and lead an effective, cross-border team, optimising productivity, quality, and speed to meet growing demands. Establish effective partnerships and networks to accelerate value and impact.

Competencies: Strategic thinker with solid business acumen and the ability to translate complex data challenges into business opportunities. Deep knowledge of data architecture, governance, and AI/ML technologies. Focused on promoting creativity, prioritising continuous advancement, and generating valuable contributions. Extensive experience leading data and AI functions within large-scale digital transformation programs, with a strong track record of delivering enterprise-grade data products and AI solutions. Proficient in managing relationships with stakeholders and effectively communicating across all levels. Proven experience in building and scaling data and AI teams within global, matrix-driven organisations, while managing large, diverse teams across multiple geographies.
Fluent in English; additional languages are highly beneficial.

Additional information: The work location for this position is Chennai, India, or Copenhagen or Aarhus, Denmark. Applications are handled on an ongoing basis. Please apply online with your letter of motivation and CV as soon as possible, but no later than 1st August 2025. For any additional information, please reach out to Vips Patel, Vippa@vestas.com

BEWARE - RECRUITMENT FRAUD: It has come to our attention that there are a number of fraudulent emails from people pretending to work for Vestas. Read more via this link: https://www.vestas.com/en/careers/our-recruitment-process

DEIB Statement: At Vestas, we recognise the value of diversity, equity, and inclusion in driving innovation and success. We strongly encourage individuals from all backgrounds to apply, particularly those who may hesitate due to their identity or feel they do not meet every criterion. As our CEO states, "Expertise and talent come in many forms, and a diverse workforce enhances our ability to think differently and solve the complex challenges of our industry". Your unique perspective is what will help us power the solution for a sustainable, green energy future.

About Vestas: Across the globe, we have installed more wind power than anyone else. We consider ourselves pioneers within the industry, as we continuously aim to design new solutions and technologies to create a more sustainable future for all of us. With more than 185 GW of wind power installed worldwide and 40+ years of experience in wind energy, we have an unmatched track record demonstrating our expertise within the field. With 30,000 employees globally, we are a diverse team united by a common goal: to power the solution - today, tomorrow, and far into the future. Vestas promotes a diverse workforce which embraces all social identities and is free of any discrimination. We commit to create and sustain an environment that acknowledges and harvests different experiences, skills, and perspectives.
Posted 1 month ago
15.0 - 20.0 years
45 - 55 Lacs
Hyderabad
Work from Office
Silicon Labs (NASDAQ: SLAB) is the leading innovator in low-power wireless connectivity, building embedded technology that connects devices and improves lives. Merging cutting-edge technology into the world's most highly integrated SoCs, Silicon Labs provides device makers the solutions, support, and ecosystems needed to create advanced edge connectivity applications. Headquartered in Austin, Texas, Silicon Labs has operations in over 16 countries and is the trusted partner for innovative solutions in the smart home, industrial IoT, and smart cities markets. Learn more at www.silabs.com

What we are looking for: A Director of Data Analytics - IT at Silicon Labs, who will lead the strategy, governance, and execution of data and analytics initiatives across the enterprise. Reporting to the Chief Information Officer, they will oversee a global team, deliver trusted insights, and support both strategic and day-to-day reporting needs for Sales, Operations, Finance, and other key functions. Their leadership will ensure data quality and integrity, and enable data-driven decision-making at all levels of the organization. This role is expected to define the data strategy for AI readiness with ROI in mind.

Meet The Team: Our global Data Analytics team is a high-impact group embedded within the IT organization at Silicon Labs. Spanning locations across North America, Europe, and Asia, the team partners cross-functionally with Sales, Marketing, Operations, Finance, Engineering, and HR to deliver trusted, actionable insights. From real-time dashboards to long-term data governance, we enable data-informed decisions that drive business success. We're a collaborative, technically skilled team that thrives on solving complex data challenges using tools like Azure, SQL, Tableau, and Python, while fostering a culture of mentorship, innovation, and continuous improvement.

Key Responsibilities:
Lead & Shape Strategy: Define and execute the global data analytics strategy for IT, driving alignment with Silicon Labs' overall business goals. Build, mentor, and grow a high-performing team of data analysts, data engineers, and BI developers. Promote a strong data-driven culture that empowers teams across IT and the broader organization.
Drive Data Architecture & Governance: Oversee the design and management of scalable, secure data architectures and pipelines, both cloud-based and on-premises. Implement and enforce best practices for data governance, privacy, and quality to ensure trusted, compliant data. Own and optimize enterprise data warehousing, reporting platforms, and analytics tools.
Deliver Business Intelligence & Advanced Analytics: Lead the delivery of comprehensive BI solutions, including operational, financial, and executive dashboards. Enable cutting-edge analytics initiatives such as predictive modeling, machine learning, and self-service analytics. Collaborate with business partners to transform strategic objectives into clear, actionable insights.
Manage Programs & Ensure Impact: Drive multiple analytics projects simultaneously, ensuring on-time delivery with measurable outcomes. Work cross-functionally to prioritize analytics use cases based on business value and ROI. Ensure analytics platforms and solutions are highly available, scalable, and performant.

Skills you will need: Proven experience with cloud data platforms such as Azure, AWS, or GCP, including expertise in data lakes and enterprise data warehouses.
Experience building lakehouse platforms with technologies like Databricks or Snowflake. Hands-on experience guiding teams toward architectures that prepare data for future modeling needs. Strong proficiency in tools and technologies including Power BI, Tableau, Snowflake, SQL, Python, and modern ETL frameworks. Demonstrated success leading data transformation programs in mid- to large-scale, global organizations. Ability to approach data and systems from an enterprise architecture perspective, ensuring alignment with overall business strategy. Experience in semiconductor, IoT, or high-tech industries is highly desirable.

Education and/or Experience: 15+ years of experience in IT, with at least 8 years in a leadership role focused on data analytics or business intelligence. Bachelor's or Master's degree in Computer Science, Information Systems, Data Science, or a related field. An MBA or equivalent business training is a plus.

Benefits & Perks: Not only will you be joining a highly skilled and tight-knit team where every engineer makes a significant impact on the product; we also strive for excellent work/life balance and to make our environment welcoming and fun. Equity Rewards (Restricted Stock Units). Employee Stock Purchase Plan (ESPP). Insurance plans with outpatient cover. National Pension Scheme (NPS). Flexible work policy. Childcare support.

Silicon Labs is an equal opportunity employer and values the diversity of our employees. Employment decisions are made on the basis of qualifications and job-related criteria without regard to race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status, or any other characteristic protected by applicable law.
Posted 1 month ago
13.0 - 15.0 years
20 - 25 Lacs
Bengaluru
Work from Office
[{"Salary":null , "Remote_Job":false , "Posting_Title":"Data Architect" , "Is_Locked":false , "City":"Bangalore" , "Industry":"Technology" , "Job_Description":" KeyResponsibilities: Design and architect end-to-end data solutions usingMicrosoft Fabric, Azure Data Factory, Azure Synapse Analytics, and other Azuredata services Develop comprehensive data architecture blueprints,including logical and physical data models Create data integration patterns and establish bestpractices for data ingestion, transformation, and consumption Design data lake and lakehousearchitectures optimized for performance, cost, and governance Lead implementation of Microsoft Fabric solutions includingData Factory, Data Activator, Power BI, and Real-Time Analytics Design and implement medallion architecture (Bronze,Silver, Gold layers) within Fabric Optimize OneLake storage and data organization strategies Configure and manage Fabricworkspaces, capacity, and security models Architect complex ETL/ELT pipelines using Azure DataFactory and Fabric Data Factory Design real-time and batch data processing solutions Implement data quality frameworks andmonitoring solutions RequiredQualifications: Overall, 13-15 years of experience; 5+ years ofexperience in data architecture and analytics solutions Hands-on experience with MicrosoftFabric, Expert-level proficiency in Azure data services(Azure Data Factory, Synapse Analytics, Azure SQL Database, Cosmos DB) Strong experience with Power BI development andadministration Proficiency in SQL, Python, and/or Scala for dataprocessing Experience with Delta Lake and Apache Spark Proficiency in data cataloging tools and techniques Experience in data governance using Purview or UnityCatalog like tools Expertise in Azure DataBricks in conjunction with AzureData Factory and Synapse Implementation and optimization using Medallionarchitecture Experience with EventHub and IoT data (streaming) Strong understanding of Azure cloud architecture andservices Knowledge of Git, Azure DevOps, andCI/CD pipelines for data solutions Understanding of containerization andorchestration technologies Hands-on experience with Fabric Data Factory pipelines Experience with Fabric Data Activator for real-timemonitoring Knowledge of Fabric Real-Time Analytics (KQL databases) Understanding of Fabric capacity management andoptimization Experience with OneLake and Fabric
Posted 1 month ago
6.0 - 12.0 years
20 - 27 Lacs
Bengaluru
Work from Office
The Opportunity: As a Program Manager, you'll orchestrate the seamless integration of data strategies into our projects, ensuring every initiative is powered by business value and insights. Your role will involve aligning cross-functional teams, streamlining processes, and driving innovation through data-driven approaches. Join us in revolutionizing how we leverage data to drive success, and be instrumental in shaping our organization's future. Seize the chance to be the architect of change and propel your career to new heights as a Program Manager. We are a global data team of innovators, united by our dedication to engineering excellence and our passion for crafting impactful solutions to problems. Our mission is to empower organizations to become data-driven, driving positive change.

About the Team: We are seeking a highly capable and business-savvy Technical Program Manager to lead and scale our enterprise BI initiatives. This role combines deep technical expertise with strategic program management and cross-functional leadership. You will oversee a team of BI developers and analysts, manage data initiatives across 4-5 business functions, and ensure the delivery of impactful, scalable analytics solutions that drive business performance.

Your Role:
Program & Project Leadership: Lead the planning, execution, and delivery of BI programs across multiple business domains (e.g., Finance, Sales, Operations, Marketing). Manage a team of 8-10 BI developers, analysts, and data engineers. Define program roadmaps, set priorities, and ensure timely delivery of high-quality solutions.
Technical Oversight: Guide the design and implementation of data models, ETL pipelines, and BI dashboards. Ensure adherence to best practices in data architecture, governance, and security. Evaluate and implement BI tools and platforms to support evolving business needs.
Business Engagement: Act as the primary liaison between technical teams and business stakeholders. Translate business goals into technical requirements and data strategies. Facilitate workshops, demos, and reviews to ensure alignment and adoption.
Team Development & Collaboration: Mentor and develop team members, fostering a culture of innovation and continuous improvement. Promote collaboration across engineering, product, and business teams. Champion data literacy and self-service analytics across the organization.

What You Will Bring: Bachelor's or Master's degree in Computer Science, Information Systems, Data Analytics, or a related field. 10+ years of experience in BI, data analytics, or data engineering, with 3+ years in a technical program or team leadership role. Strong experience with BI tools (e.g., Power BI, Tableau), SQL, and cloud data platforms. Proven ability to manage cross-functional programs and lead technical teams. Excellent communication, stakeholder management, and problem-solving skills.

Preferred Attributes: Experience in enterprise or matrixed environments. Familiarity with Agile methodologies and tools (e.g., Jira, Confluence).

Work Arrangement - Hybrid: This role operates in a hybrid capacity, blending the benefits of remote work with the advantages of in-person collaboration. For most roles, that will mean coming into an office a minimum of 3 days per week; however, certain roles and/or teams may require more frequent in-office presence. Additional team-specific guidance and norms will be provided by your manager.

Nutanix is an Equal Employment Opportunity and (in the U.S.) an Affirmative Action employer.
Qualified applicants are considered for employment opportunities without regard to race, color, religion, sex, sexual orientation, gender identity or expression, national origin, age, marital status, protected veteran status, disability status or any other category protected by applicable law. We hire and promote individuals solely on the basis of qualifications for the job to be filled. We strive to foster an inclusive working environment that enables all our Nutants to be themselves and to do great work in a safe and welcoming environment, free of unlawful discrimination, intimidation or harassment. As part of this commitment, we will ensure that persons with disabilities are provided reasonable accommodations. If you need a reasonable accommodation, please let us know by contacting [email protected] .
Posted 1 month ago
9.0 - 14.0 years
40 - 75 Lacs
Bengaluru, Delhi / NCR, Mumbai (All Areas)
Work from Office
Design and implement a data architecture that supports the organization's business goals and objectives. Develop data models, define data standards and guidelines, and establish processes for data integration, migration, and management. Create and maintain data dictionaries, which are a comprehensive set of data definitions and metadata that provide context and understanding of the organization's data assets (a small sketch follows this posting). Ensure that the data is accurate, consistent, and reliable across the organization, including establishing data quality metrics and monitoring data quality on an ongoing basis. Work closely with other IT professionals, including database administrators, data analysts, and developers, to ensure that the organization's data architecture is integrated and aligned with other IT systems and applications. Stay up to date with new technologies and trends in data management and architecture and evaluate their potential impact on the organization's data architecture. Communicate with stakeholders across the organization.

You Bring (Experience & Qualifications): A BTech / MTech degree in Computer Science or a related field. At least 7+ years of experience working on data architecture. Expertise in data modeling and design, including conceptual, logical, and physical data models, and the ability to translate business requirements into data models. Proficiency in a variety of data management technologies, including relational databases, NoSQL databases, data warehouses, and data lakes. Expertise in ETL processes, including data extraction, transformation, and loading, and the ability to design and implement data integration processes. Experience with data analysis and reporting tools and techniques, and the ability to design and implement data analysis and reporting processes. Familiarity with industry-standard data architecture frameworks, such as TOGAF or Zachman, and the ability to apply them to the organization's data architecture. Familiarity with cloud computing technologies, including public and private clouds, and the ability to design and implement data architectures that leverage cloud computing. Certificates in Database Management will be preferred.
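To ground the data dictionary responsibility, a small hedged sketch that derives a dictionary skeleton from a dataset's schema (the table is invented; the business definitions would be filled in with data owners):

```python
import pandas as pd

# Hypothetical customer table.
customers = pd.DataFrame({
    "customer_id": [1, 2],
    "signup_date": pd.to_datetime(["2024-01-05", "2024-02-11"]),
    "region": ["APAC", "EMEA"],
})

# Auto-generate the technical half of a data dictionary; the business
# definition column is left blank for stakeholders to complete.
dictionary = pd.DataFrame({
    "column": customers.columns,
    "dtype": [str(t) for t in customers.dtypes],
    "null_count": customers.isna().sum().to_numpy(),
    "definition": [""] * len(customers.columns),
})
print(dictionary)
```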
Posted 1 month ago
10.0 - 15.0 years
5 - 15 Lacs
Bengaluru
Work from Office
About Tredence: Tredence is a global data science solutions provider founded in 2013 by Shub Bhowmick, Sumit Mehra, and Shashank Dubey, focused on solving the last-mile problem in AI. Headquartered in San Jose, California, the company embraces a vertical-first approach and an outcome-driven mindset to help clients win and accelerate value realization from their analytics investments. The aim is to bridge the gap between insight delivery and value realization by providing customers with a differentiated approach to data and analytics through tailor-made solutions. Tredence is 3500-plus employees strong with offices in San Jose, Foster City, Chicago, London, Toronto, and Bangalore, with the largest companies in retail, CPG, hi-tech, telecom, healthcare, travel, and industrials as clients. Please find the job description below.

About the Role: We are looking for an experienced and visionary Generative AI Architect with 10-15 years of experience in AI/ML, including hands-on work with LLMs (Large Language Models) and Generative AI solutions. In this strategic technical leadership role, you will be responsible for designing and overseeing the development of advanced GenAI platforms and solutions that transform business operations and customer experiences. As the GenAI Architect, you will work closely with data scientists, ML engineers, product teams, and stakeholders to conceptualize, prototype, and scale generative AI use cases across the organization or client engagements.

Key Responsibilities:
GenAI Solution Architecture & Design: Lead the design and development of scalable GenAI solutions leveraging LLMs, diffusion models, and multimodal architectures. Architect end-to-end pipelines involving prompt engineering, vector databases, retrieval-augmented generation (RAG), and LLM fine-tuning (a minimal retrieval sketch follows this posting). Select and integrate foundational models (e.g., GPT, Claude, LLaMA, Mistral) based on business needs and technical constraints.
Technical Strategy & Leadership: Define GenAI architecture blueprints, best practices, and reusable components for rapid development and experimentation. Guide teams on model evaluation, inference optimization, and cost-effective scaling strategies. Stay current on the rapidly evolving GenAI landscape and assess emerging tools, APIs, and frameworks.
Collaboration & Delivery: Work with product owners, business leaders, and data teams to identify high-impact GenAI use cases across domains like customer support, content generation, document understanding, and code generation. Support PoCs, pilots, and production deployments of GenAI models in secure, compliant environments. Collaborate with MLOps and cloud teams to enable continuous delivery, monitoring, and governance of GenAI systems.

Required Qualifications & Experience:
Education: Bachelor's or Master's degree in Computer Science, Artificial Intelligence, Data Science, or a related technical field. A PhD is a plus.
Experience: 12-15 years in AI/ML and software engineering, with 3+ years focused on Generative AI and LLM-based architectures.
Core Skills: Deep expertise in machine learning, natural language processing (NLP), and deep learning architectures. Hands-on experience with LLMs, transformers, fine-tuning techniques (LoRA, PEFT), and prompt engineering. Proficient in Python, with libraries/frameworks such as Hugging Face Transformers, LangChain, OpenAI API, PyTorch, TensorFlow. Experience with vector databases (e.g., Pinecone, FAISS, Weaviate) and RAG pipelines.
Strong understanding of cloud-native AI architectures (AWS/GCP/Azure), containerization (Docker/Kubernetes), and API integration.
Architectural & Leadership Skills: Proven ability to design and deliver scalable, secure, and efficient GenAI systems. Strong communication skills for cross-functional collaboration and stakeholder engagement. Ability to mentor engineering teams and drive innovation across the AI/ML ecosystem.
Nice-to-Have: Experience with multimodal models (text + image/audio/video). Knowledge of AI governance, ethical AI, and compliance frameworks. Familiarity with MLOps practices for GenAI, including model versioning, drift detection, and performance monitoring.
Required Skills: Generative AI, LLM, GenAI
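For concreteness, a minimal retrieval step for a RAG pipeline using FAISS and sentence-transformers, two of the tools named above (the documents, model choice, and final LLM call are illustrative, and generation is stubbed out):

```python
import faiss
from sentence_transformers import SentenceTransformer

docs = [
    "Invoices are archived after 90 days.",            # toy corpus
    "Refunds above $500 require manager approval.",
]

model = SentenceTransformer("all-MiniLM-L6-v2")        # small, commonly used encoder
emb = model.encode(docs, convert_to_numpy=True).astype("float32")
faiss.normalize_L2(emb)                                # cosine similarity via inner product
index = faiss.IndexFlatIP(emb.shape[1])
index.add(emb)

query = "When do invoices get archived?"
q = model.encode([query], convert_to_numpy=True).astype("float32")
faiss.normalize_L2(q)
scores, ids = index.search(q, k=1)

context = docs[int(ids[0][0])]
prompt = f"Answer using only this context:\n{context}\n\nQuestion: {query}"
# `prompt` would then be passed to an LLM (GPT, Claude, LLaMA, etc.).
print(prompt)
```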
Posted 1 month ago
8.0 - 13.0 years
20 - 25 Lacs
Hyderabad
Work from Office
Primary Skills: Azure Synapse Pipelines, Azure Enterprise Data Warehouse (EDW), Microsoft Fabric, and Power BI. Good-to-have skills: Experience with Azure DevOps, GitHub, or other CI/CD pipelines.

Summary: We are seeking a highly skilled Data Engineering Lead to drive the design, development, and delivery of enterprise-grade data and analytics solutions using Azure Synapse Pipelines, Azure Enterprise Data Warehouse (EDW), Microsoft Fabric, and Power BI. This role involves leading a team of data engineers and analysts, working closely with stakeholders, and architecting scalable solutions across modern Azure data platforms.

Key Responsibilities: Lead end-to-end data engineering efforts including design, development, deployment, and optimization of data pipelines and warehouse solutions on Azure. Architect and manage scalable Azure Synapse Pipelines (SQL and Apache Spark) for ingesting, transforming, and loading large volumes of structured and semi-structured data. Oversee Azure EDW (Dedicated SQL Pools) design, data modeling, and performance tuning. Implement and scale Microsoft Fabric workloads (Lakehouse, Warehouse, Dataflows Gen2, OneLake, Notebooks, Pipelines). Drive Power BI semantic model design, DAX development, and dashboard/reporting best practices across the organization. Collaborate with business and technical teams to understand data requirements and ensure solutions are aligned to enterprise goals. Manage data governance, metadata, quality, and security in compliance with organizational and regulatory standards. Provide technical mentorship and guidance to data engineers and BI developers. Establish DevOps/CI-CD practices for version control, deployment, and monitoring. Stay up to date with new Azure/Fabric features and recommend improvements.

Required Skills & Experience: 8+ years of experience in data engineering and business intelligence. Strong hands-on expertise in: Azure Synapse Analytics (SQL Pools, Spark Pools, Pipelines); Azure Data Lake (Gen2) and Azure EDW; Microsoft Fabric (OneLake, Lakehouse, Warehouse, Pipelines, Dataflows Gen2, Notebooks); Power BI (DAX, Power Query, Tabular Models, Gateways). Proficiency in SQL, T-SQL, and Apache Spark (PySpark or Scala). Deep understanding of data warehousing concepts, dimensional modeling, and data lakehouse architectures. Strong experience with performance tuning and enterprise-scale data architecture.

Preferred Skills: Experience with Azure DevOps, GitHub, or other CI/CD pipelines. Knowledge of data governance frameworks and tools like Microsoft Purview. Experience with Azure Monitor, Log Analytics, and other monitoring tools.
Posted 1 month ago
3.0 - 6.0 years
5 - 8 Lacs
Mumbai, Nagpur, Thane
Work from Office
Role: Brokerage is a leading global financial services firm providing a wide range of investment banking, securities, investment management and wealth management services. We advise, originate, trade, manage and distribute capital for governments, institutions and individuals. As a market leader, the talent and passion of our people is critical to our success. Together, we share a common set of values rooted in integrity, excellence and a strong team ethic. We provide you a superior foundation for building a professional career where you can learn, achieve and grow.

Technology is the key differentiator that ensures that we manage our global businesses and serve clients on a market-leading platform that is resilient, safe, efficient, smart, fast and flexible. Technology redefines how we do business in global, complex and dynamic financial markets. We have a large number of award-winning technology platforms that help to propel our Firm's businesses to be the top in the market. We have built strong techno-functional teams which partner with our offices globally, taking global ownership of systems and products. We have a vibrant and diverse mix of technologists working on different technologies and functional domains. There is a large focus on innovation, inclusion, giving back to the community and sharing knowledge.

The Data Center of Excellence (COE) is a group within the Cyber Data Risk & Resilience Division that focuses on data as a key priority of Brokerage's overall strategy. The Data CoE develops common principles for ownership, distribution and consumption of data, tooling and standards for data accessibility, and a framework for governing data, and helps address data architecture and data quality issues for new and existing initiatives at the firm by collaborating heavily with various business units and technology functions.

We are looking for an experienced front-end developer to join the Data CoE Tooling fleet as we expand and pursue rapid delivery driven by firmwide and regulatory initiatives. The candidate will be expected to work at a senior level within an Agile squad, planning and implementing changes in our developing set of UI projects implemented predominantly in Angular. The developer will be expected to deliver at all stages of the software development lifecycle: gathering requirements, offering best-practice solutions to rapidly evolving goals and working closely with other fleet members to ensure deliverables are produced to time and to the highest standard.

Responsibilities: The successful candidate will be a highly motivated team player and a confident self-starter, with development acumen towards solving engineering problems.
Key responsibilities of this role are: Developing new components and services in Angular, RxJS, Ag-grid and Material; integrating with new server-side microservices and, where required, advising on or implementing server changes Performing code reviews and guidance for other developers in the fleet; guiding other UI developers in industry best practices Building automated unit and end-to-end tests for new and existing features Actively participating in code reviews and Agile ceremonies Creating prototypes and wireframes for new features in conjunction with business users and stakeholders Required Skills Strong expertise with demonstratable work history of designing and developing modern web applications in Angular Expert level JavaScript/TypeScript knowledge in a cross-browser environment Strong expertise with reactive web development using RxJS Knowledge of Ag-Grid Enterprise features and styling/testing concerns Use of component/styling libraries e.g. Material and visualization/graphing libraries; D3 Ability to create wireframes and prototypes from complex requirements in order to iterate prototype designs with stakeholders (Balsamiq/Figma) Proficiency in writing unit tests with Karma and end-to-end tests using Cypress/Cucumber Strong technical analysis and problem-solving skills Strong communicator Proficiency in Git, Bitbucket, CI/CD pipelines, build tooling Desired Skills Previous IB background Expertise in server-side development (Java/Spring frameworks) Knowledge of ngrx or similar Experience of server-side development using Node Experience with designing RESTful Web Services/microservices Creation/design of dashboards in Tableau or similar
Posted 1 month ago
5.0 - 10.0 years
50 - 100 Lacs
Bengaluru
Work from Office
Roles and Responsibilities
Job Overview
We are looking for a savvy Data Engineer to join our growing team of data engineers. The hire will be responsible for expanding and optimizing our data and data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams. The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up. The Data Engineer will support our software developers, database architects, and data analysts on data initiatives and will ensure optimal data delivery architecture is consistent throughout ongoing projects. They must be self-directed and comfortable supporting the data needs of multiple teams, systems and products. The right candidate will be excited by the prospect of optimizing or even re-designing our company's data architecture to support our next generation of products and data initiatives.
Responsibilities for Data Engineer
Create and maintain optimal data pipeline architecture.
Assemble large, complex data sets that meet functional and non-functional business requirements.
Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and Spark on Azure big data technologies (a minimal PySpark sketch follows this listing).
Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
Work with stakeholders, including the Executive, Product, Data and Design teams, to assist with data-related technical issues and support their data infrastructure needs.
Work with data and analytics experts to strive for greater functionality in our data systems.
Qualifications for Data Engineer
Advanced working SQL knowledge and experience working with relational databases and query authoring, as well as working familiarity with a variety of databases.
Experience building and optimizing big data pipelines, architectures and data sets.
Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
Strong analytic skills related to working with unstructured datasets.
Experience building processes supporting data transformation, data structures, metadata, dependency and workload management.
A successful history of manipulating, processing and extracting value from large, disconnected datasets.
Working knowledge of message queuing, stream processing, and highly scalable big data stores.
Strong project management and organizational skills.
Experience supporting and working with cross-functional teams in a dynamic environment.
We are looking for a candidate with 5+ years of experience in a Data Engineer role who has experience with the following software/tools:
Big data tools: Hadoop, Spark, Kafka, etc.
Relational SQL and NoSQL databases, including Azure SQL, Cosmos DB and Couchbase.
Data pipeline and workflow management tools: Azure Data Factory, Synapse Pipelines.
Azure cloud services: Databricks, Synapse Analytics, Azure Functions, ADLS.
Stream-processing systems: Storm, Spark Streaming, etc.
Object-oriented/functional scripting languages: Python, Java, C++, Scala, etc.
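As a concrete illustration of the "SQL and Spark on Azure" pipeline work described above, here is a minimal PySpark extract-transform-load sketch. The storage paths and column names are hypothetical placeholders, not taken from the posting.

```python
# Minimal PySpark ETL sketch: read raw CSV, derive an aggregate, write Parquet.
# Paths and column names are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Extract: raw CSV landed in a data lake (e.g. an ADLS-backed mount path).
raw = spark.read.csv("/mnt/raw/orders.csv", header=True, inferSchema=True)

# Transform: drop incomplete rows and compute a daily revenue aggregate.
daily_revenue = (
    raw.dropna(subset=["order_id", "amount"])
       .withColumn("order_date", F.to_date("order_ts"))
       .groupBy("order_date")
       .agg(F.sum("amount").alias("revenue"))
)

# Load: write the curated result back to the lake, partitioned by date.
daily_revenue.write.mode("overwrite").partitionBy("order_date").parquet(
    "/mnt/curated/daily_revenue"
)
```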
Posted 1 month ago
5.0 - 10.0 years
7 - 12 Lacs
Mumbai
Work from Office
This is a remote position.
Key Responsibilities:
Design, implement, and optimize data architectures to support large-scale AI and machine learning systems.
Collaborate with cross-functional teams to define data models, APIs, and integration flows.
Architect secure, scalable data pipelines for structured and unstructured data (a minimal orchestration sketch follows this listing).
Oversee data governance, access controls, and compliance (GDPR, SOC 2, etc.).
Select appropriate data storage technologies (SQL/NoSQL/data lakes) for various workloads.
Work with MLOps and DevOps teams to enable real-time data availability and model serving.
Evaluate and integrate third-party APIs, datasets, and connectors.
Contribute to system documentation and data architecture diagrams.
Support AI researchers with high-quality, well-structured data pipelines.
Requirements
Required Qualifications:
Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
5+ years of experience as a Data Architect, Data Engineer, or in a similar role.
Expertise in designing cloud-based data architectures (AWS, Azure, GCP).
Strong knowledge of SQL, NoSQL, and distributed databases (PostgreSQL, MongoDB, Cassandra, etc.).
Experience with big data tools like Spark, Kafka, Airflow, or similar.
Familiarity with data warehousing tools (Redshift, BigQuery, Snowflake).
Solid understanding of data privacy, compliance, and governance best practices.
Preferred Qualifications:
Experience working on AI/ML or Gen AI-related products.
Proficiency in Python or another scripting language used for data processing.
Exposure to building APIs for data ingestion and consumption.
Prior experience supporting enterprise-level SaaS products.
Strong analytical and communication skills.
Travel & Documentation Requirement:
Candidate must hold a valid passport.
Willingness to travel overseas for 1 week (as part of client collaboration).
Having a valid US visa (e.g., B1/B2, H1B, Green Card, etc.) is a strong advantage.
Benefits
Why Join Us:
Work on high-impact, cutting-edge Generative AI products.
Collaborate with some of the best minds in AI, engineering, and product.
Flexible work culture with global exposure.
Opportunity to work on deeply technical challenges with real-world impact.
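Since the role names Airflow alongside Spark and Kafka for pipeline work, below is a minimal Apache Airflow DAG sketch (assuming Airflow 2.4+ for the `schedule` parameter) showing the extract-to-transform dependency chaining such a pipeline typically uses. The DAG id and task bodies are illustrative placeholders.

```python
# Minimal Airflow DAG sketch: two Python tasks chained into a daily pipeline.
# DAG id and task bodies are illustrative placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder: pull raw records from a source system or API.
    print("extracting raw records")


def transform():
    # Placeholder: validate and reshape the extracted records.
    print("transforming records")


with DAG(
    dag_id="ai_feature_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)

    # Extraction must complete before transformation starts.
    extract_task >> transform_task
```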
Posted 1 month ago
15.0 - 20.0 years
17 - 20 Lacs
Mumbai
Work from Office
This role requires a deep understanding of data warehousing, business intelligence (BI), and data governance principles, with a strong focus on the Microsoft technology stack.
Data Architecture
Develop and maintain the overall data architecture, including data models, data flows and data quality standards. Design and implement data warehouses, data marts and data lakes on the Microsoft Azure platform.
Business Intelligence
Design and develop complex BI reports, dashboards, and scorecards using Microsoft Power BI.
Data Engineering
Work with data engineers to implement ETL/ELT pipelines using Azure Data Factory (a minimal data-quality sketch for such a pipeline follows this listing).
Data Governance
Establish and enforce data governance policies and standards.
Primary Skills
Experience
15+ years of relevant experience in data warehousing, BI, and data governance.
Proven track record of delivering successful data solutions on the Microsoft stack.
Experience working with diverse teams and stakeholders.
Required Skills and Experience
Technical Skills:
Strong proficiency in data warehousing concepts and methodologies.
Expertise in Microsoft Power BI.
Experience with Azure Data Factory, Azure Synapse Analytics, and Azure Databricks.
Knowledge of SQL and scripting languages (Python, PowerShell).
Strong understanding of data modeling and ETL/ELT processes.
Secondary Skills
Soft Skills
Excellent communication and interpersonal skills.
Strong analytical and problem-solving abilities.
Ability to work independently and as part of a team.
Strong attention to detail and organizational skills.
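To make the ETL/ELT and data-quality responsibilities concrete, here is a minimal Python (pandas) sketch of the kind of pre-load validation step a pipeline stage might run before writing to a warehouse. The dataframe, columns, and rules are hypothetical.

```python
# Minimal pre-load data-quality check sketch in pandas.
# Columns and rules are hypothetical placeholders.
import pandas as pd

df = pd.DataFrame({
    "customer_id": [1, 2, None, 4],
    "amount": [120.0, -5.0, 80.0, 42.5],
})

# Rule 1: key columns must be non-null.
null_keys = int(df["customer_id"].isna().sum())

# Rule 2: amounts must be non-negative.
negative_amounts = int((df["amount"] < 0).sum())

# A real pipeline would quarantine failing rows or fail the load;
# here we just report the violation counts.
print("quality report:", {"null_keys": null_keys, "negative_amounts": negative_amounts})
```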
Posted 1 month ago
10.0 - 15.0 years
32 - 45 Lacs
Hyderabad, Gurugram, Bengaluru
Hybrid
Job Description
Function: Software Engineering - Big Data / DWH / ETL
Azure Data Factory, Azure Synapse, ETL, Spark SQL, Scala
Responsibilities:
Designing and implementing scalable and efficient data architectures.
Creating data models and optimizing data structures for performance and usability.
Implementing and managing data lakehouses and real-time analytics solutions using Microsoft Fabric.
Leveraging Fabric's OneLake, Dataflows, and Synapse Data Engineering for seamless data management.
Enabling end-to-end analytics and AI-powered insights.
Developing and orchestrating data pipelines in Azure Data Factory.
Managing ETL/ELT processes for data integration across various sources.
Optimizing data workflows for performance and cost efficiency (a minimal Spark SQL sketch follows this listing).
Designing interactive dashboards and reports in Power BI.
Implementing data models, DAX calculations, and performance optimizations.
Ensuring data quality, security, and governance in reporting solutions.
Requirements:
A Data Architect with 10+ years of experience and Microsoft Fabric skills who designs and implements data solutions using Fabric, focusing on data integration, analytics, and automation, while ensuring data quality, security, and compliance.
Primary Skills (Must Have): Azure Data Pipelines, Apache Spark, ETL, Azure Data Factory, Azure Synapse, Azure Functions, Spark SQL, SQL.
Secondary Skills (Good to Have): other Azure services, Python/Scala, DataStage (preferably), and Fabric.
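As a small illustration of the Spark SQL skill the listing asks for, the sketch below registers a DataFrame as a temporary view and queries it with SQL, the same pattern a lakehouse reporting layer might use. Table and column names are hypothetical.

```python
# Minimal Spark SQL sketch: register a DataFrame as a view and query it with SQL.
# Table and column names are hypothetical placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("spark-sql-demo").getOrCreate()

sales = spark.createDataFrame(
    [("EMEA", 120.0), ("EMEA", 80.0), ("APAC", 200.0)],
    ["region", "amount"],
)
sales.createOrReplaceTempView("sales")

# The same aggregation a reporting layer might run against a lakehouse table.
spark.sql("""
    SELECT region, SUM(amount) AS total_amount
    FROM sales
    GROUP BY region
    ORDER BY total_amount DESC
""").show()
```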
Posted 1 month ago
15.0 - 20.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Adobe Experience Platform (AEP)
Good-to-have skills: Java Enterprise Edition
Minimum 12 years of experience is required.
Educational Qualification: Minimum 15 years of continuous education.
Key Responsibilities:
Lead AEP-based project deliveries as architect for experience transformation, leveraging AEP with integrations to other platforms or legacy systems for industry-specific use cases such as Retail, Banking, etc.
Solution Design: Collaborate with stakeholders to understand business requirements and translate them into scalable and efficient Adobe Experience Platform CDP solutions. Design data architecture, integration patterns, and workflows to optimize the collection, storage, and activation of customer data.
Technical Experience:
Adobe AEP expertise, with more than 2 years of experience leading AEP-based implementations.
Extensive experience as a technical architect or solution architect, with a focus on customer data platforms (CDP).
Overall knowledge of capturing customer data from different data sources to aggregate and generate customer insights with analytics products.
Knowledge and experience of offer decisioning and marketing activations through AJO, Target, etc.
Experience in data-driven experience transformation.
Professional Attributes:
Good verbal and written communication skills to connect with customers at varying levels of the organization.
Ability to operate independently and make decisions with little direct supervision.
Effective coordination and analytical skills.
Leadership skills to lead a team of AEP and marketing specialists.
Posted 1 month ago
8.0 - 10.0 years
6 - 10 Lacs
Chennai
Work from Office
We are seeking a skilled Data Engineer with expertise in MuleSoft to join our dynamic team. In this role, you will be responsible for designing, developing, and maintaining robust data integration solutions that leverage MuleSoft's powerful capabilities. You will collaborate closely with cross-functional teams to gather requirements and translate them into scalable data architectures. Our ideal candidate is not only proficient in data engineering but also has a strong understanding of API-led connectivity and microservices architecture. You will work on various projects that involve data extraction, transformation, and loading (ETL) processes, as well as ensuring the integrity and accessibility of data across different systems. Your analytical mindset and problem-solving skills will be crucial in optimizing data flows and enhancing performance. Additionally, you will be involved in the automation of data processes, implementing best practices for data management, and ensuring compliance with data governance policies. By joining our team, you will have the opportunity to work with a variety of technologies, contribute to innovative projects, and grow your skills in a collaborative environment.
Responsibilities:
Design and implement ETL processes using MuleSoft to integrate data from various sources.
Collaborate with stakeholders to gather and understand data integration requirements.
Develop and maintain data pipelines and workflows to ensure efficient data transfer and processing (a minimal ingestion sketch follows this listing).
Optimize data models for performance and scalability across different applications and environments.
Monitor and troubleshoot data integration processes, addressing any issues that arise.
Ensure data quality and integrity by implementing validation and cleansing procedures.
Document data flows, processes, and integration designs to maintain comprehensive records.
Requirements:
Bachelor's degree in Computer Science, Engineering, or a related field.
Proven experience as a Data Engineer with a strong focus on MuleSoft technologies.
Hands-on experience with API development and integration using MuleSoft Anypoint Platform.
Strong understanding of data modeling concepts and database management systems.
Proficiency in programming languages such as Java, Python, or SQL.
Experience with cloud services such as AWS, Azure, or Google Cloud Platform.
Excellent problem-solving skills and attention to detail, with the ability to work collaboratively.
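Mule flows themselves are built in Anypoint Studio with DataWeave transforms rather than in a general-purpose language, so as a language-neutral illustration of the validate-and-cleanse step described above, here is a minimal Python sketch. The endpoint URL and record fields are hypothetical placeholders.

```python
# Minimal API-ingestion-and-cleansing sketch (illustrative only; a real Mule
# integration would express this as an Anypoint flow with DataWeave transforms).
# The endpoint URL and record fields are hypothetical placeholders.
import requests

SOURCE_URL = "https://example.com/api/customers"  # hypothetical endpoint


def clean(record: dict) -> dict | None:
    """Drop records without an id; normalize email casing and whitespace."""
    if not record.get("id"):
        return None
    email = (record.get("email") or "").strip().lower()
    return {"id": record["id"], "email": email}


resp = requests.get(SOURCE_URL, timeout=10)
resp.raise_for_status()

# Keep only records that pass the cleansing rules.
cleaned = [c for r in resp.json() if (c := clean(r)) is not None]
print(f"ingested {len(cleaned)} clean records")
```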
Posted 1 month ago
8.0 - 20.0 years
20 - 25 Lacs
Pune
Work from Office
We are looking for a highly skilled and visionary AI Architect to lead the design, development, and implementation of Generative AI solutions across AWS and Microsoft Azure environments. This role is pivotal in shaping our GenAI strategy through the creation of scalable, secure, and responsible AI systems, leveraging both agentic and non-agentic workflow designs. You will provide technical leadership in architecting AI-powered solutions that span retrieval-augmented generation (RAG), vector search, foundation models, prompt engineering, and enterprise-grade AI governance, all within the AWS and Azure cloud ecosystems.
Key Responsibilities:
Architect and deliver generative AI solutions in AWS (Bedrock, SageMaker) and Azure (Azure OpenAI, Azure ML) environments.
Lead the design of agentic workflows using frameworks such as AWS Agents for Bedrock or Azure orchestration tools.
Build non-agentic AI pipelines using RAG methodologies with vector databases (e.g., Amazon OpenSearch, Azure Cognitive Search); a minimal retrieval sketch follows this listing.
Design and implement prompt engineering and prompt management strategies for large language models (LLMs) in cloud services.
Evaluate and integrate foundation models (e.g., GPT, Claude, Titan, Phi-2, Falcon, Mistral) via Amazon Bedrock or Azure OpenAI.
Develop chunking and indexing strategies for unstructured data to support vector-based search and RAG workflows.
Ensure strong AI governance and responsible AI practices, including security, explainability, auditability, and ethical usage in alignment with enterprise policies.
Collaborate with data engineering and DevOps teams to ensure seamless data pipeline integration, model lifecycle management, and CI/CD automation.
Guide the development of reference architectures, best practices, and reusable components for GenAI use cases across business units.
Stay current with evolving GenAI capabilities in the AWS and Azure ecosystems, providing technical thought leadership and strategic guidance.
Required Qualifications:
Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
8+ years of experience in software/data architecture, with 3+ years in AI/ML, including hands-on work with generative AI solutions.
Proven experience designing and deploying AI workflows using:
AWS: Amazon Bedrock, SageMaker, Lambda, DynamoDB, OpenSearch.
Azure: Azure OpenAI, Azure ML, Azure Cognitive Services, Cognitive Search.
Expertise in RAG pipeline architecture, prompt engineering, and vector database design.
Familiarity with tools and frameworks for AI agent orchestration (e.g., LangChain, Semantic Kernel, AWS Agent Framework).
Strong understanding of cloud security, identity management (IAM, RBAC), and compliance in enterprise environments.
Proficiency in Python and hands-on experience with modern ML libraries and APIs used in AWS and Azure.
Preferred Qualifications:
Experience working with LLMOps tools in cloud environments (e.g., model monitoring, logging, performance tracking).
Understanding of fine-tuning strategies, model evaluation, and safety/risk management of GenAI models.
Familiarity with serverless architecture, containerization (ECS, AKS), and CI/CD practices in AWS/Azure.
Ability to translate business problems into scalable AI solutions with measurable outcomes.
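To ground the RAG terminology used above: retrieval reduces to embedding a query, scoring it against pre-embedded document chunks, and passing the top hits to the model as context. Below is a self-contained sketch of that scoring step using toy vectors and cosine similarity; a production system would use a real embedding model and a vector database instead of the hard-coded arrays.

```python
# Minimal RAG retrieval sketch: cosine-similarity scoring over toy embeddings.
# Real systems would use an embedding model and a vector store; the vectors
# and chunk texts below are illustrative placeholders.
import numpy as np

chunks = ["refund policy text", "shipping policy text", "warranty text"]
chunk_vecs = np.array([
    [0.9, 0.1, 0.0],
    [0.1, 0.8, 0.2],
    [0.0, 0.3, 0.9],
])
query_vec = np.array([0.85, 0.15, 0.05])  # "how do refunds work?" (toy vector)

# Cosine similarity between the query and every chunk.
scores = chunk_vecs @ query_vec / (
    np.linalg.norm(chunk_vecs, axis=1) * np.linalg.norm(query_vec)
)

# Take the top-2 chunks as context for the LLM prompt.
top = np.argsort(scores)[::-1][:2]
context = "\n".join(chunks[i] for i in top)
print(context)
```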
Posted 1 month ago