
6020 Databricks Jobs - Page 28

Set up a job alert
JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

12.0 years

0 Lacs

India

Remote

Work Location: Remote. Work Timings: 12 pm - 9 pm. Notice Period: Immediate or currently serving notice. Must have excellent communication skills. Overall experience: 12-14 years. Candidates should have 6+ years of relevant experience as a Scrum Master, specifically managing Data Analytics projects; this experience must be clearly detailed in the resume.

Scrum Master – Data Analytics
We're seeking a highly motivated Scrum Master who thrives in fast-paced environments, inspires teams, and enables the delivery of impactful data and analytics solutions for manufacturing and supply chain operations. Act as Scrum Master for Agile teams delivering data and analytics solutions for manufacturing and supply chain operations. Work closely with Product Owners to align on business priorities, maintain a clear and actionable backlog, and ensure stakeholder needs are met. Facilitate core Agile ceremonies: Sprint Planning, Daily Standups, Backlog Refinement, Reviews, and Retrospectives. Guide the team through data-focused sprints, including work on ingestion, transformation, integration, and reporting. Track progress, remove blockers, and drive continuous improvement in team performance and delivery. Collaborate with data engineers, analysts, architects, and business teams to ensure high-quality, end-to-end solutions. Promote Agile best practices across platforms like SAP ECC, IBP, HANA, BOBJ, Databricks, and Tableau. Monitor and share Agile metrics (e.g., velocity, burn-down) to keep teams and stakeholders aligned. Support team capacity planning, identify bottlenecks early, and help the team stay focused and accountable. Foster a culture of collaboration, adaptability, and frequent customer feedback to ensure business value is delivered in every sprint. Guide the team to continuously break down efforts into smaller components; smaller work pieces result in better flow (having 8 stories/tasks of half a day each is better than having 1 story/task of 4 days). Guide the team to always provide clarity on stories/tasks by using detailed descriptions and explicit acceptance criteria. In daily standup meetings, bring the team's focus to completing things instead of working on things.

Posted 1 week ago

Apply

7.0 - 10.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Summary
We are seeking a highly skilled Sr. Developer with 7 to 10 years of experience to join our dynamic team. The ideal candidate will have expertise in Python, Databricks SQL, Databricks Workflows, and PySpark. This role operates in a hybrid work model with day shifts, offering the opportunity to work on innovative projects that drive our company's success.

Responsibilities
Develop and maintain scalable data processing systems using Python and PySpark to enhance data analytics capabilities. Collaborate with cross-functional teams to design and implement Databricks Workflows that streamline data operations. Optimize Databricks SQL queries to improve data retrieval performance and ensure efficient data management. Provide technical expertise in Python programming to support the development of robust data solutions. Oversee the integration of data sources into Databricks environments to facilitate seamless data processing. Ensure data quality and integrity by implementing best practices in data validation and error handling. Troubleshoot and resolve complex technical issues related to Databricks and PySpark environments. Contribute to the continuous improvement of data processing frameworks and methodologies. Mentor junior developers and provide guidance on best practices in data engineering. Collaborate with stakeholders to gather requirements and translate them into technical specifications. Conduct code reviews to ensure adherence to coding standards and best practices. Stay updated with the latest industry trends and technologies to drive innovation in data engineering. Document technical processes and workflows to support knowledge sharing and team collaboration.

Qualifications
Possess a strong proficiency in Python programming and its application in data engineering. Demonstrate expertise in Databricks SQL and its use in optimizing data queries. Have hands-on experience with Databricks Workflows for efficient data processing. Show proficiency in PySpark for developing scalable data solutions. Exhibit excellent problem-solving skills and the ability to troubleshoot complex technical issues. Have a solid understanding of data integration techniques and best practices. Display strong communication skills to collaborate effectively with cross-functional teams.

Certifications Required
Databricks Certified Data Engineer Associate; Python Institute PCEP Certification
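
For candidates gauging the level expected here, a minimal PySpark sketch of the kind of transformation-plus-validation step this posting describes (table names, columns, and thresholds are hypothetical, not the employer's actual code):

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_daily_load").getOrCreate()

# Read a raw source table registered in the metastore (hypothetical name).
raw = spark.table("raw.orders")

# Basic validation: drop rows missing keys and flag negative amounts.
clean = (
    raw.dropna(subset=["order_id", "customer_id"])
       .withColumn("is_valid_amount", F.col("amount") >= 0)
)

# Simple quality check before writing downstream.
invalid = clean.filter(~F.col("is_valid_amount")).count()
if invalid > 0:
    print(f"Warning: {invalid} rows with negative amounts")

# Aggregate for reporting and persist to a hypothetical target table.
daily = clean.groupBy("order_date").agg(F.sum("amount").alias("total_amount"))
daily.write.mode("overwrite").saveAsTable("analytics.daily_order_totals")
```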

Posted 1 week ago

Apply

5.0 years

0 Lacs

Indore, Madhya Pradesh, India

On-site

Are you ready to write your next chapter? Make your mark at one of the biggest names in payments. With proven technology, we process the largest volume of payments in the world, driving the global economy every day. When you join Worldpay, you join a global community of experts and changemakers, working to reinvent an industry by constantly evolving how we work and making the way millions of people pay easier, every day. What makes a Worldpayer? It’s simple: Think, Act, Win. We stay curious, always asking the right questions to be better every day, finding creative solutions to simplify the complex. We’re dynamic, every Worldpayer is empowered to make the right decisions for their customers. And we’re determined, always staying open – winning and failing as one. We’re looking for a Sr AWS Databricks Admin to join our Big Data Team to help us unleash the potential of every business. Are you ready to make your mark? Then you sound like a Worldpayer. About The Team We are seeking a talented and experienced Senior AWS Data Lake Engineer to join our dynamic team who can design, develop, and maintain scalable data pipelines and manage AWS Data Lake solutions. The ideal candidate will have extensive experience in handling sensitive data, including Personally Identifiable Information (PII) and Payment Card Industry (PCI) data, using advanced tokenization and masking techniques. What You Will Be Doing Design, develop, and maintain scalable data pipelines using Python and AWS services. Implement and manage AWS Data Lake solutions, including ingestion, storage, and cataloging of structured and unstructured data. Apply data tokenization and masking techniques to protect sensitive information in compliance with data privacy regulations (e.g., GDPR, HIPAA). Collaborate with data engineers, architects, and security teams to ensure secure and efficient data flows. Optimize data workflows for performance, scalability, and cost-efficiency. Monitor and troubleshoot data pipeline issues and implement robust logging and alerting mechanisms. Document technical designs, processes, and best practices. Provide support on Databricks and Snowflake. Maintain comprehensive documentation for configurations, procedures, and troubleshooting steps. What You Bring 5+ years of experience working as a Python developer/architect. Strong proficiency in Python, with experience in data processing libraries (e.g., Pandas, PySpark). Proven experience with AWS services such as S3, Glue, Lake Formation, Lambda, Athena, and IAM. Solid understanding of data lake architecture and best practices. Experience with data tokenization, encryption, and anonymization techniques. Familiarity with data governance, compliance, and security standards. Experience with Snowflake and/or Databricks (Nice to have). Experience with CI/CD tools and version control (e.g., Git, CodePipeline). Strong problem-solving skills and attention to detail. Where you’ll own it You’ll own it in our modern Bangalore/Pune/Indore hub. With hubs in the heart of city centers and tech capitals, things move fast in APAC. We pride ourselves on being an agile and dynamic collective, collaborating with different teams and offices across the globe. Worldpay Perks - What We’ll Bring For You We know it’s bigger than just your career. It’s your life, and your world. That’s why we offer global benefits and programs to support you at every stage. Here’s a taste of what you can expect. A competitive salary and benefits. Time to support charities and give back to your community. 
Parental leave policy. Global recognition platform. Virgin Pulse access. Global employee assistance program. What Makes a Worldpayer At Worldpay, we take our Values seriously, and we live them every day. Think like a customer, Act like an owner, and Win as a team. Curious. Humble. Creative. We ask the right questions, listening and learning to get better every day. We simplify the complex and we’re always looking to create a bigger impact for our colleagues and customers. Empowered. Accountable. Dynamic. We stay agile, using our initiative, taking calculated risks to progress. Never standing still, never settling, we work at pace to achieve our goals. We champion our ideas and stay flexible to make them happen. We know that every action adds up. Determined. Inclusive. Open. Unlocking potential means working as one global community. Our work spans borders, and we stay united by our purpose. We collaborate, always encouraging others to perform at their best, welcoming new perspectives. Does this sound like you? Then you sound like a Worldpayer. Apply now to write the next chapter in your career. We can’t wait to hear from you. To find out more about working with us, find us on LinkedIn. Privacy Statement Worldpay is committed to protecting the privacy and security of all personal information that we process in order to provide services to our clients. For specific information on how Worldpay protects personal information online, please see the Online Privacy Notice. Sourcing Model Recruitment at Worldpay works primarily on a direct sourcing model; a relatively small portion of our hiring is through recruitment agencies. Worldpay does not accept resumes from recruitment agencies which are not on the preferred supplier list and is not responsible for any related fees for resumes submitted to job postings, our employees, or any other part of our company. #pridepass
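
As an illustration of the masking and tokenization techniques this posting mentions (not Worldpay's actual implementation; column names are hypothetical and the hash-based token is a simplified stand-in for a real tokenization service):

```python
import hashlib
import pandas as pd

def mask_pan(pan: str) -> str:
    """Mask a card number, keeping only the last four digits."""
    return "*" * (len(pan) - 4) + pan[-4:]

def tokenize(value: str, salt: str) -> str:
    """Replace a sensitive value with a deterministic, non-reversible token."""
    return hashlib.sha256((salt + value).encode("utf-8")).hexdigest()

df = pd.DataFrame({
    "customer_email": ["a@example.com", "b@example.com"],
    "card_number": ["4111111111111111", "5500005555555559"],
})

salt = "load-from-a-secrets-manager"  # never hard-code secrets in real pipelines
df["customer_email"] = df["customer_email"].apply(lambda v: tokenize(v, salt))
df["card_number"] = df["card_number"].apply(mask_pan)
print(df)
```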

Posted 1 week ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

At SiteMinder we believe the individual contributions of our employees are what drive our success. That's why we hire and encourage diverse teams that include and respect a variety of voices, identities, backgrounds, experiences and perspectives. Our diverse and inclusive culture enables our employees to bring their unique selves to work and be proud of doing so. It's in our differences that we will keep revolutionising the way for our customers. We are better together!

What We Do…
We're people who love technology but know that hoteliers just want things to be simple. So since 2006 we've been constantly innovating our world-leading hotel commerce platform to help accommodation owners find and book more guests online - quickly and simply. We've helped everyone from boutique hotels to big chains, enabling travellers to book igloos, cabins, castles, holiday parks, campsites, pubs, resorts, Airbnbs, and everything in between. And today, we're the world's leading open hotel commerce platform, supporting 47,000 hotels in 150 countries - with over 125 million reservations processed by SiteMinder's technology every year.

About The Engineering Director Role…
As the Engineering Director, Data Engineering, you will be responsible for establishing and leading SiteMinder's data engineering function in Pune. You will play a critical role in setting up a world-class development team, driving best practices in data strategy, architecture and governance. Your work will ensure the scalability, availability and security of our data platform. You will build and grow new data engineering teams in Pune, hiring top talent and fostering a culture of innovation, collaboration and technical excellence. You will also work closely with global engineering, product and analytics teams to enable data-driven decision-making at scale. This role reports to the VP, Data and requires strong engagement with the Chief Technology Officer and Chief Data Officer.

What You'll Do…
Set up & scale the engineering team in Pune: Lead the recruitment, onboarding and development of a high-performing engineering team, with an initial focus on data and broadening over time to other engineering capabilities. Establish SiteMinder's engineering presence in Pune, creating a strong technical foundation for future growth. Foster an inclusive, high-trust engineering culture that encourages learning, growth and innovation. Line manage, mentor and support your team to drive performance.
Promote SiteMinder's story in Pune: Be an advocate and spokesperson at local events, community groups and publications to tell the SiteMinder story and attract the right talent.
Execute SiteMinder's data strategy: Work with stakeholders to execute the vision, strategy and roadmap for SiteMinder's data platform. Lead the team implementing scalable data architectures to support business needs. Deliver high-impact data initiatives. Establish best practices for data governance, security, privacy and compliance.
Enhance data infrastructure & operations: Collaborate with architects to build and maintain a modern data platform with scalable pipelines and real-time analytics capabilities. Lead initiatives to improve data quality, reliability and observability. Optimise data storage, processing and retrieval strategies for efficiency and cost-effectiveness. Drive the adoption and optimisation of Databricks for scalable data processing and machine learning workloads.
Collaborate across global teams: Work closely with global engineering, product and analytics teams to ensure data solutions align with business objectives. Collaborate with the Chief Technology Officer, Chief Data Officer, Principal Data Engineer(s), Chief Engineer, Software Engineers, Engineering Managers and other key engineering roles. Partner with leadership to define KPIs, data policies and governance frameworks. Advocate for a data-driven culture across SiteMinder.

What You Have…
Extensive experience in data engineering, including significant experience in leadership roles. Proven track record in building and scaling development teams in India, preferably in a global organisation. Strong experience in hiring, mentoring and leading high-performing teams. Expertise in cloud-based data platforms (AWS) and modern data architectures. Strong hands-on experience with big data technologies (Spark, Kafka, Snowflake, Databricks, etc.). Experience designing and optimising large-scale data processing solutions using Databricks. Deep knowledge of data governance, security and compliance best practices. Experience leading the implementation of data pipelines, ETL frameworks and real-time streaming solutions. Strong stakeholder management and the ability to align technical solutions with business objectives. Passion for driving innovation in data engineering and empowering teams to excel.

Our Perks & Benefits…
Mental health and well-being initiatives. Generous parental (including secondary) leave policy. Flexibility to work in a hybrid model (2-3 days in-office). Paid birthday, study and volunteering leave every year. Sponsored social clubs, team events, and celebrations. Employee Resource Groups (ERG) to help you connect and get involved. Investment in your personal growth, offering training for your advancement.

Does this job sound like you? If yes, we'd love for you to be part of our team! Please send a copy of your resume and our Talent Acquisition team will be in touch. When you apply, please tell us the pronouns you use and any adjustments you may need during the interview process. We encourage people from underrepresented groups to apply.

Posted 1 week ago

Apply

3.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Particle41 is seeking a talented and versatile Data Engineer to join our innovative team. As a Data Engineer, you will play a key role in designing, building, and maintaining robust data pipelines and infrastructure to support our clients' data needs. You will work on end-to-end data solutions, collaborating with cross-functional teams to ensure high-quality, scalable, and efficient data delivery. This is an exciting opportunity to contribute to impactful projects, solve complex data challenges, and grow your skills in a supportive and dynamic environment.

In This Role, You Will:
Software Development: Design, develop, and maintain scalable ETL (Extract, Transform, Load) pipelines to process large volumes of data from diverse sources. Build and optimize data storage solutions, such as data lakes and data warehouses, to ensure efficient data retrieval and processing. Integrate structured and unstructured data from various internal and external systems to create a unified view for analysis. Ensure data accuracy, consistency, and completeness through rigorous validation, cleansing, and transformation processes. Maintain comprehensive documentation for data processes, tools, and systems while promoting best practices for efficient workflows.
Requirements Gathering and Analysis: Collaborate with product managers and other stakeholders to gather requirements and translate them into technical solutions. Participate in requirement analysis sessions to understand business needs and user requirements. Provide technical insights and recommendations during the requirements-gathering process.
Agile Development: Participate in Agile development processes, including sprint planning, daily stand-ups, and sprint reviews. Work closely with Agile teams to deliver software solutions on time and within scope. Adapt to changing priorities and requirements in a fast-paced Agile environment.
Testing and Debugging: Conduct thorough testing and debugging to ensure the reliability, security, and performance of applications. Write unit tests and validate the functionality of developed features and individual elements. Write integration tests to ensure the different elements within a given application function as intended and meet desired requirements. Identify and resolve software defects, code smells, and performance bottlenecks.
Continuous Learning and Innovation: Stay updated with the latest technologies and trends in full-stack development. Propose innovative solutions to improve the performance, security, scalability, and maintainability of applications. Continuously seek opportunities to optimize and refactor the existing codebase for better efficiency. Stay up to date with cloud platforms such as AWS, Azure, or Google Cloud Platform.
Collaboration: Collaborate effectively with cross-functional teams, including testers and product managers. Foster a collaborative and inclusive work environment where ideas are shared and valued.

Skills and Experience We Value:
Bachelor's degree in computer science, engineering, or a related field. Proven experience as a Data Engineer, with a minimum of 3 years of experience. Proficiency in the Python programming language. Experience with database technologies such as SQL (e.g., MySQL, PostgreSQL) and NoSQL (e.g., MongoDB) databases. Strong understanding of programming libraries/frameworks and technologies such as Flask, API frameworks, data warehousing/lakehouse principles, databases and ORMs, and data analysis tools including Databricks, pandas, Spark, PySpark, machine learning, OpenCV, and scikit-learn.
Utilities & Tools: logging, requests, subprocess, regex, pytest ELK stack, Redis, distributed task queues Strong understanding of data warehousing/lakehousing principles and concurrent/parallel processing concepts. Familiarity with at least one cloud data engineering stack (Azure, AWS, or GCP) and the ability to quickly learn and adapt to new ETL/ELT tools across various cloud providers. Familiarity with version control systems like Git and collaborative development workflows. Competence in working on Linux OS and creating shell scripts. Solid understanding of software engineering principles, design patterns, and best practices. Excellent problem-solving and analytical skills, with a keen attention to detail. Effective communication skills, both written and verbal, and the ability to collaborate in a team environment. Adaptability and willingness to learn new technologies and tools as needed. About Particle41 Our core values of Empowering, Leadership, Innovation, Teamwork, and Excellence drive everything we do to achieve the ultimate outcomes for our clients. Empowering Leadership for Innovation in Teamwork with Excellence ( ELITE ) E - Empowering: Enabling individuals to reach their full potential L - Leadership: Taking initiative and guiding each other toward success I - Innovation: Embracing creativity and new ideas to stay ahead T - Teamwork: Collaborating with empathy to achieve common goals E - Excellence: Striving for the highest quality in everything we do We seek team members who embody these values and are committed to contributing to our mission. Particle41 welcomes individuals from all backgrounds who are committed to our mission and values. We provide equal employment opportunities to all employees and applicants, ensuring that hiring and employment decisions are based on merit and qualifications without discrimination based on race, color, religion, caste, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state, local, or international laws. This policy applies to all aspects of employment and hiring. We appreciate your interest and encourage applicants from these regions to apply. If you need any assistance during the application or interview process, please feel free to reach out to us at careers@Particle41.com.

Posted 1 week ago

Apply

4.0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site

DXFactor is a US-based tech company working with customers across the globe. We are a Great Place to Work certified company. We are looking for candidates for Data Engineer (4 to 6 years' experience). We have our presence in the US and India (Ahmedabad, Bangalore).
Location: Ahmedabad
Website: www.DXFactor.com
Designation: Data Engineer (expertise in Snowflake, AWS & Python)

Key Responsibilities
Design, develop, and maintain scalable data pipelines for batch and streaming workflows. Implement robust ETL/ELT processes to extract data from various sources and load it into data warehouses. Build and optimize database schemas following best practices in normalization and indexing. Create and maintain documentation for data flows, pipelines, and processes. Collaborate with cross-functional teams to translate business requirements into technical solutions. Monitor and troubleshoot data pipelines to ensure optimal performance. Implement data quality checks and validation processes. Build and maintain CI/CD workflows for data engineering projects. Stay current with emerging technologies and recommend improvements to existing systems.

Requirements
Bachelor's degree in Computer Science, Information Technology, or a related field. Minimum 4+ years of experience in data engineering roles. Strong proficiency in Python programming and SQL query writing. Hands-on experience with relational databases (e.g., PostgreSQL, MySQL) and NoSQL databases (e.g., MongoDB, Cassandra). Experience with data warehousing technologies (e.g., Snowflake, Redshift, BigQuery). Proven track record in building efficient and scalable data pipelines. Practical knowledge of batch and streaming data processing approaches. Experience implementing data validation, quality checks, and error handling mechanisms. Working experience with cloud platforms, particularly AWS (S3, EMR, Glue, Lambda, Redshift) and/or Azure (Data Factory, Databricks, HDInsight). Understanding of different data architectures including data lakes, data warehouses, and data mesh. Demonstrated ability to debug complex data flows and optimize underperforming pipelines. Strong documentation skills and ability to communicate technical concepts effectively.
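
Purely as an illustration of the "data quality checks and validation processes" item above (file and column names are hypothetical, not DXFactor's actual pipeline), a lightweight Python gate might look like this:

```python
import pandas as pd

def check_quality(df: pd.DataFrame) -> dict:
    """Run simple completeness and uniqueness checks before loading downstream."""
    return {
        "row_count": len(df),
        "null_customer_ids": int(df["customer_id"].isna().sum()),
        "duplicate_order_ids": int(df["order_id"].duplicated().sum()),
        "negative_amounts": int((df["amount"] < 0).sum()),
    }

orders = pd.read_csv("orders_extract.csv")  # hypothetical batch extract
report = check_quality(orders)
if report["null_customer_ids"] or report["duplicate_order_ids"]:
    raise ValueError(f"Data quality check failed: {report}")
print(report)
```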

Posted 1 week ago

Apply

15.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Key Responsibilities
Leadership and Management: Lead the ISS distribution, Client Propositions, Sustainable Investing and Regulatory reporting data outcomes defining the data roadmap and capabilities and supporting the execution and delivery of the data solutions as a Data Product lead within the ISS Data Programme. Line management responsibilities for junior data analysts within the chapter, coaching, influencing and motivating them for high performance. Define the data product vision and strategy with end-to-end thought leadership. Lead and define the data product backlog, documentation, enable peer-reviews, analysis effort estimation, maintain backlog, and support end to end planning. Be a catalyst of change for driving efficiencies, scale and innovation.
Data Quality and Integrity: Define data quality use cases for all the required data sets and contribute to the technical frameworks of data quality. Align the functional solution with the best practice data architecture & engineering.
Coordination and Communication: Senior management level communication to influence senior tech and business stakeholders globally, get alignment on the roadmaps. Coordinate with internal and external teams to communicate with those impacted by data flows. An advocate for the ISS Data Programme. Collaborate closely with Data Governance, Business Architecture, and Data owners etc. Conduct workshops within the scrum teams and across business teams, effectively document the minutes and drive the actions.

Your Skills and Experience
Strong leadership and senior management level communication, internal and external client management and influencing skills. At least 15 years of proven experience as a senior business/technical/data analyst within technology and/or business change delivering data led business outcomes within the financial services/asset management industry.
5-10 years as a data product owner adhering to agile methodology, delivering data solutions using industry leading data platforms such as Snowflake, State Street Alpha Data, Refinitiv Eikon, SimCorp Dimension, BlackRock Aladdin, FactSet etc. Outstanding knowledge of Client life cycle covering institutional & wholesale with a focus on CRM data, Transfer agency data. Very good understanding of the data generated by investment management processes and how that is leveraged in Go-to market capabilities such as client reporting, Sales, Marketing. Excellent knowledge of regulatory environment with a focus on European regulations and ESG specific ones such as MIFID II, EMIR, SFDR.

Posted 1 week ago

Apply

3.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Overview Cvent is a leading meetings, events, and hospitality technology provider with more than 4,800 employees and ~22,000 customers worldwide, including 53% of the Fortune 500. Founded in 1999, Cvent delivers a comprehensive event marketing and management platform for marketers and event professionals and offers software solutions to hotels, special event venues and destinations to help them grow their group/MICE and corporate travel business. Our technology brings millions of people together at events around the world. In short, we’re transforming the meetings and events industry through innovative technology that powers the human connection. The DNA of Cvent is our people, and our culture has an emphasis on fostering intrapreneurship - a system that encourages Cventers to think and act like individual entrepreneurs and empowers them to take action, embrace risk, and make decisions as if they had founded the company themselves. At Cvent, we value the diverse perspectives that each individual brings. Whether working with a team of colleagues or with clients, we ensure that we foster a culture that celebrates differences and builds on shared connections. In This Role, You Will Work with a talented group of data scientists, software developers and product owners to identify possible applications for AI and machine learning. Understand Cvent's product lines, their business models and the data collected. Perform machine learning research on various types of data ( numeric, text, audio, video, image). Deliver machine learning models in a state that can be picked up by a software development team and operationalized via Cvent's products. Thoroughly document your work as well as results, to the extent that another data scientist can replicate them. Progressively enhance your skills in machine learning, writing production-quality Python code, and communication. Here's What You Need A Bachelor's degree in a quantitative field (natural sciences, math, statistics, computer science). At least 3 years of experience working as a data scientist in industry. In-depth familiarity with the Linux operating system and command-line work. Conceptual and technical understanding of machine learning, including model training and evaluation. Experience with formal Python coding. Proficiency in machine learning packages in Python. Familiarity with Generative AI based system development. Experience with relational databases and query-writing in SQL. Knowledge of linear algebra and statistics. Skills in data exploration and interpretation. It Will Be Excellent If You Also Have A Master's or PhD degree in a quantitative field. Experience with Databricks and/or Snowflake platforms. Ability to write production-quality Python code with testing coverage. Experience working on a cloud platform (AWS/Azure/Google Cloud), especially machine learning R&D on a cloud platform. Knowledge of the software development lifecycle, including Git processes, code reviews, test-driven development and CI/CD. Experience with A/B testing. Skills in data visualization and dashboarding. Knowledge of how to interact with a REST API. Proven ability for proactive, independent work with some supervision. Strong verbal and written communication.
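
For a sense of the "model training and evaluation" baseline this posting asks for, a minimal scikit-learn sketch on synthetic data (illustrative only; not Cvent's stack or data):

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import classification_report

# Synthetic stand-in for product event data.
X, y = make_classification(n_samples=2000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42
)

model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)

# Evaluate on held-out data; per-class precision and recall.
print(classification_report(y_test, model.predict(X_test)))
```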

Posted 1 week ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

About Our Team
The Content and Data Analytics team is part of DataOps, which is an integral part of Global Operations at Elsevier. We provide data analysis services, primarily using Databricks, and mostly serve product owners and data scientists of Elsevier's Research Data Platform. Our work contributes to the delivery of leading data analytics products for the world of scientific research, including Scopus and SciVal.

About The Role
As a Senior Data Analyst, you have a solid understanding of best practices and can execute projects and initiatives without supervision from others. You are able to create advanced-level insights and recommendations, and to lead analytics efforts of high complexity largely independently.

Responsibilities
The Senior Data Analyst role will support data scientists working within the Domains of the Research Data Platform. Domains are functional units that are responsible for delivering one or more data products, often through data science algorithms. Supporting this work could lead to a wide range of different analytical activities. For example, you may be asked to dive into large datasets to answer questions from product owners or data scientists; you may need to perform large-scale data preparation (data prep) in order to test hypotheses or support prototypes; you may be asked to review the precision and recall of data science algorithms at scale and surface these as dashboard metrics. You will need to have a keen eye for detail, good analytical skills, and expertise in at least one data analysis system. Above all, you will need curiosity, dedication to high-quality work, and an interest in the world of scientific research and the products that Elsevier creates to serve it. Because you will need to communicate with a range of stakeholders around the world, we ask for candidates to demonstrate a high level of English.

Requirements
Minimum work experience of 5 years. Coding skills in at least one programming language (preferably Python) and SQL. Familiarity with common string manipulation functions or libraries such as regular expressions (regex). Prior exposure to data analysis in a tabular form, for example with Pandas or Apache Spark/Databricks. Experience using basic statistics relevant to data science, such as precision, recall and statistical significance. Knowledge of visualization tools such as Tableau/Power BI is a plus. Experience working with Agile tools such as JIRA is a plus.

Stakeholder Management
Build and maintain strong relationships with Data Scientists and Product Managers. Align activities with Data Scientists and Product Managers. Present achievements and project status updates, both written and verbally, to various stakeholders.

Competencies
Collaborates well and works effectively as part of a team. Takes initiative and is proactive in suggesting approaches or solutions to problems. Drives for results by taking a task to a polished conclusion.

Key Results
Understand the requirements of a given task. Identify, gather, prepare and refine data. Interpret and understand large data sets. Report findings to stakeholders through effective storytelling. Formulate recommendations and requirements. Identify and address new opportunities.

Way that Works for You
We promote a healthy work-life balance across the organization. We offer numerous well-being initiatives, shared parental leave, study assistance, and sabbaticals to help you meet both your immediate responsibilities and long-term goals.
Working for You We understand that your well-being and happiness are essential to a successful career. Here are some benefits we offer: Comprehensive Health Insurance. Enhanced Health Insurance Options. Group Life Insurance. Group Accident Insurance. Flexible Working Arrangements. Employee Assistance Program. Medical Screening. Modern Family Benefits include maternity, paternity, and adoption support. Long Service Awards. Celebrating New Baby Gift. Subsidized Meals (location-specific). Various Paid Time Off options including Casual Leave, Sick Leave, Privilege Leave, Compassionate Leave, Special Sick Leave, and Gazetted Public Holidays. Free Transport for home-office-home commutes (location-specific). About The Business We are a global leader in information and analytics, assisting researchers and healthcare professionals in advancing science and improving health outcomes. We combine quality information and extensive data sets with analytics to support science and research, health education, and interactive learning. At our company, your work contributes to addressing the world's grand challenges and fostering a sustainable future. We utilize innovative technologies to support science and healthcare, partnering with us for a better world.

Posted 1 week ago

Apply

5.0 years

0 Lacs

India

Remote

Design, develop, and maintain scalable data pipelines using Databricks and Apache Spark. Write efficient Python code for data transformation, cleansing, and analytics. Collaborate with data scientists, analysts, and engineers to understand data needs and deliver high-performance solutions. Optimize and tune data pipelines for performance and cost efficiency. Implement data validation, quality checks, and monitoring. Work with cloud platforms (preferably Azure or AWS) to manage data workflows. Ensure best practices in code quality, version control, and documentation.

Required Skills & Experience:
5+ years of professional experience in Python development. 3+ years of hands-on experience with Databricks (including notebooks, clusters, Delta Lake, and job orchestration). Strong experience with Spark (PySpark preferred). Proficient in working with large-scale data processing and ETL/ELT pipelines. Solid understanding of data warehousing concepts and SQL. Experience with Azure Data Factory, AWS Glue, or other data orchestration tools is a plus. Familiarity with version control tools like Git. Excellent problem-solving and communication skills.

Job Type: Full-time
Pay: ₹2,000,000.00 - ₹3,000,000.00 per year
Schedule: Monday to Friday
Experience: Spark: 6 years (Required); Azure Data: 5 years (Required); Git: 5 years (Required); CI/CD: 5 years (Required); DevOps: 5 years (Required); Databricks: 5 years (Required); Python: 5 years (Required)
Work Location: Remote
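
To illustrate the Delta Lake experience mentioned above, a short PySpark sketch of an incremental load into a partitioned Delta table (paths and schema are hypothetical):

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("events_delta_load").getOrCreate()

# Ingest a day's raw JSON events from a hypothetical landing path.
events = spark.read.json("/mnt/raw/events/2024-01-01/")

cleaned = (
    events.dropDuplicates(["event_id"])
          .withColumn("event_date", F.to_date("event_ts"))
)

# Append into a partitioned Delta table for downstream analytics.
(cleaned.write.format("delta")
        .mode("append")
        .partitionBy("event_date")
        .save("/mnt/curated/events"))
```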

Posted 1 week ago

Apply

10.0 years

0 Lacs

Gurugram, Haryana, India

On-site

About Company
Founded in 2011, ReNew is one of the largest renewable energy companies globally, with a leadership position in India. Listed on Nasdaq under the ticker RNW, ReNew develops, builds, owns, and operates utility-scale wind energy projects, utility-scale solar energy projects, utility-scale firm power projects, and distributed solar energy projects. In addition to being a major independent power producer in India, ReNew is evolving to become an end-to-end decarbonization partner, providing solutions in a just and inclusive manner in the areas of clean energy, green hydrogen, value-added energy offerings through digitalisation, storage, and carbon markets that increasingly are integral to addressing climate change. With a total capacity of more than 13.4 GW (including projects in the pipeline), ReNew's solar and wind energy projects are spread across 150+ sites, with a presence spanning 18 states in India, contributing to 1.9% of India's power capacity. Consequently, this has helped to avoid 0.5% of India's total carbon emissions and 1.1% of India's total power sector emissions. In the over 10 years of its operation, ReNew has generated almost 1.3 lakh jobs, directly and indirectly. ReNew has achieved market leadership in the Indian renewable energy industry against the backdrop of the Government of India's policies to promote growth of this sector. ReNew's current group of stockholders contains several marquee investors including CPP Investments, Abu Dhabi Investment Authority, Goldman Sachs, GEF SACEF and JERA. Its mission is to play a pivotal role in meeting India's growing energy needs in an efficient, sustainable, and socially responsible manner. ReNew stands committed to providing clean, safe, affordable, and sustainable energy for all and has been at the forefront of leading climate action in India.

Job Description
Key Responsibilities
Understand, implement, and automate ETL pipelines in line with industry standards. Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, and designing infrastructure for greater scalability. Develop, integrate, test, and maintain existing and new applications. Design and create data pipelines (data lake / data warehouses) for real-world energy analytical solutions. Expert-level proficiency in Python (preferred) for automating everyday tasks. Strong understanding of and experience in distributed computing frameworks, particularly Spark, Spark-SQL, Kafka, Spark Streaming, Hive, Azure Databricks, etc. Limited experience in using other leading cloud platforms, preferably Azure; hands-on experience with Azure Data Factory, Logic Apps, Analysis Services, Azure Blob Storage, etc. Ability to work in a team in an agile setting, familiarity with JIRA, and a clear understanding of how Git works.
Must have 5-7 years of experience.

Posted 1 week ago

Apply

5.0 years

0 Lacs

Chandigarh

Remote

Additional Locations: India-Chandigarh-Remote; India-Hyderabad-Remote; India-Mohali; India-Mohali-Remote; India-Mumbai; India-Mumbai-Remote
Job ID: R0000034469
Category: IT

ABOUT THIS ROLE
Key Accountabilities:
Design, develop, and maintain Power BI reports and dashboards for enterprise-wide users. Work closely with business users, business analysts, data engineers, and stakeholders to gather requirements and translate them into technical solutions. Analyze complex business and operational system requirements and recommend solution options. Integrate data from various sources into Power BI using SQL queries, SharePoint, and Dataflows to provide comprehensive insights. Write and optimize SQL queries to extract and manipulate data for reporting purposes. Participate in meetings and discussions to understand business needs and provide technical insights. Stay updated with the latest developments and best practices in Power BI, SQL, and Power Platform. Propose and implement improvements to existing reports, dashboards, and processes. Experience working in an Agile software development methodology and authoring technical documents/specifications. Support the production environment, assisting business users with any issues related to data and reporting.

Skills:
A minimum of 5 years' experience with the entire Power BI stack is required. Proficiency in Power BI, including Power Query, DAX, and Power BI Service. Strong understanding of data modeling, data warehousing, and ETL/ELT processes. Experience working with data sources like SQL Server, Oracle, Azure SQL, Azure Synapse, Databricks, Blob Storage & Data Lakes. Strong understanding of data visualization best practices. Excellent analytical and problem-solving skills. Familiarity with Agile development methodologies. Knowledge of standard ITIL processes. Excellent interpersonal, verbal and written communication skills. A flexible attitude with respect to work assignments and new learning. Ability to manage multiple and varied tasks with enthusiasm and prioritize workload with attention to detail. Willingness to work in a matrix environment.

Good To Have:
Experience with Microsoft Power Platform (Power Apps, Power Automate). Relevant certifications (e.g., Microsoft Certified: Data Analyst Associate, Microsoft Certified: Azure Data Engineer Associate, Microsoft Certified: Power BI, etc.). Experience using Git for version control and deployment. Knowledge of Microsoft Fabric. Experience coding in Python.

Knowledge and Experience: 5 to 7 years of experience in Information Technology.
Education: Bachelor's degree in computer science, data science, software development, or another related field; a master's degree is recommended.

Posted 1 week ago

Apply

0 years

4 - 6 Lacs

Hyderābād

On-site

Role Summary & Role Description
Strong working experience in OOP concepts using technologies like Java and Spring is mandatory. Provide technical leadership for application architecture, technical design, and programming. Good understanding of and working experience in the full-stack software development lifecycle and services delivery. Demonstrated expertise in software architecture, designing scalable systems, and optimizing performance in the domains of UI, middleware, database, security, and APIs. Proficiency in engineering best practices, object-oriented programming, design patterns, SOLID principles, and producing clean, testable code. Understanding of HTTP, Internet protocols, and web browsers, and working experience using front-end frameworks such as HTML5, CSS, JSON, JavaScript and React JS. Experience in Core Java 11 - multithreading, exception handling, garbage collection, memory management. Strong experience in J2EE-related technologies (Java Beans, JSP, JDBC, JMS, J2EE, Spring Boot, Hibernate, etc.). Experience with any message broker technologies (RabbitMQ, Kafka, IBM MQ, etc.). Experience in API styles like SOAP and REST. Experience working with databases (SQL and NoSQL). Experience with containerization technologies like Docker and Kubernetes. Knowledge of and working experience with DevOps implementation and automated CI/CD pipelines using Git, Jenkins, SonarQube. Experience in unit testing, mocking and integration testing. Should have experience in developing reusable components in both front-end and back-end technologies. Work with other team members across one or more Agile scrum teams assigned to work on the project in parallel. Candidates should be able to clearly articulate the implications of design/architectural decisions. Strong attention to detail and problem-solving skills.

Core/Must have skills
React JS, HTML, CSS, JavaScript, Core Java 11, J2EE, Spring Boot framework (Spring Core), Oracle/SQL, Kafka. Experience using a public cloud (Azure or AWS) with Docker and Kubernetes.

Good to have skills
Snowflake, Databricks, PostgreSQL, MongoDB, CosmosDB

Work Schedule
Hybrid

Why this role is important to us
Our technology function, Global Technology Services (GTS), is vital to State Street and is the key enabler for our business to deliver data and insights to our clients. We're driving the company's digital transformation and expanding business capabilities using industry best practices and advanced technologies such as cloud, artificial intelligence and robotics process automation. We offer a collaborative environment where technology skills and innovation are valued in a global organization. We're looking for top technical talent to join our team and deliver creative technology solutions that help us become an end-to-end, next-generation financial services company. Join us if you want to grow your technical skills, solve real problems and make your mark on our industry.

About State Street
What we do. State Street is one of the largest custodian banks, asset managers and asset intelligence companies in the world. From technology to product innovation, we're making our mark on the financial services industry. For more than two centuries, we've been helping our clients safeguard and steward the investments of millions of people. We provide investment servicing, data & analytics, investment research & trading and investment management to institutional clients.

Work, Live and Grow. We make all efforts to create a great work environment.
Our benefits packages are competitive and comprehensive. Details vary by location, but you may expect generous medical care, insurance and savings plans, among other perks. You’ll have access to flexible Work Programs to help you match your needs. And our wealth of development programs and educational support will help you reach your full potential. Inclusion, Diversity and Social Responsibility. We truly believe our employees’ diverse backgrounds, experiences and perspectives are a powerful contributor to creating an inclusive environment where everyone can thrive and reach their maximum potential while adding value to both our organization and our clients. We warmly welcome candidates of diverse origin, background, ability, age, sexual orientation, gender identity and personality. Another fundamental value at State Street is active engagement with our communities around the world, both as a partner and a leader. You will have tools to help balance your professional and personal life, paid volunteer days, matching gift programs and access to employee networks that help you stay connected to what matters to you. State Street is an equal opportunity and affirmative action employer. Discover more at StateStreet.com/careers

Posted 1 week ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Summary
The Associate Director of DDIT US&I Analytics Operations partners with Business Stakeholders and DDIT Strategic Business Partners for demand analysis, solution proposal/evaluation and project delivery. The role drives operations of the systems and applications in scope (both Global and Local), ensuring their stability and integrity and meeting customer service levels, and also oversees AI Ops, ML Ops, and LLM Ops, ensuring the efficient and scalable operation of AI and machine learning models. This role involves managing the entire model lifecycle, from development to deployment, monitoring, and maintenance, while adhering to predefined Service Level Agreements (SLAs). The position emphasizes innovation, rapid prototyping, and enterprise-scale deployment of analytics services.

About The Role
Role purpose:
Oversee AI Ops, ML Ops, and LLM Ops to ensure efficient and scalable operations of AI and machine learning models. Manage the entire model lifecycle, including development, deployment, monitoring, and maintenance. Ensure adherence to predefined Service Level Agreements (SLAs) for AI and ML operations. Develop and maintain CI/CD Ops pipelines for seamless integration and deployment of models. Implement and manage model registries for version control and governance. Establish and enforce coding checklists and best practices. Develop and automate testing frameworks to ensure model quality and reliability. Design and manage inference pipelines for real-time and batch predictions. Drive innovation by adopting emerging technologies (GenAI, AI, NLP) and accelerating product/service development through rapid prototyping and iterative methods. Align analytics innovation efforts with business strategy, IT strategy, and legal/regulatory requirements. Identify and develop advanced analytics capabilities and ecosystem partnerships in alignment with the DnA strategy.

Key Responsibilities
Lead AI Ops, ML Ops, and LLM Ops to ensure the efficient and scalable operation of AI and machine learning models. Develop and manage the model lifecycle, including development, deployment, monitoring, and maintenance. Ensure adherence to predefined SLAs for AI and ML operations. Create and manage analytics product/services roadmaps from concept to launch. Develop and maintain CI/CD Ops pipelines for seamless integration and deployment of models. Implement and manage model registries for version control and governance. Establish and enforce coding checklists and best practices. Develop and automate testing frameworks to ensure model quality and reliability. Design and manage inference pipelines for real-time and batch predictions. Incubate and adopt emerging technologies (GenAI, AI, NLP) to accelerate product/service development through rapid prototyping and iterative methods. Align analytics innovation efforts with business strategy, IT strategy, and legal/regulatory requirements. Establish and update strategies, implementation plans, and value cases for emerging technologies. Drive innovation using appropriate people, processes, partners, and tools. Identify and develop advanced analytics capabilities and ecosystem partnerships in alignment with the DnA strategy. Oversee end-to-end delivery of analytics services and products across cross-functional business areas. Serve as the point of escalation, review, and approval for key issues and decisions. Manage resource and capacity planning in line with business priorities and strategies. Foster continuous improvement within the team.
Decide on program timelines, governance, and deployment strategies.

Key Performance Indicators
- Achieved targets in Enterprise business case contribution, KPIs, customer satisfaction, and innovation measures
- Delivery on agreed KPIs, including business impact
- Launch of innovative technology solutions across Novartis at scale
- Business impact and value generated from DDIT solutions
- Adoption and development of Agile Productization and DevOps practices
- Operations stability and effective risk management
- Feedback on customer experience
- Applications adhere to ISC requirements and are audit ready
- Business capability, vision & strategy clearly defined, communicated, and executed, well aligned to business strategy and Enterprise IT strategy, and providing a competitive advantage to Novartis
- Role model with the highest standards of professional conduct in leading the business capability area in line with the new IT operating model
- Deployment of digital platforms and services at scale to deliver the digital strategy

Skills And Experience
Demonstrated experience in budget management, business acumen, performance management, planning, project management, risk management, service delivery management and stakeholder management. Strong understanding of AI Ops, ML Ops, and LLM Ops. Experience in developing and managing the model lifecycle, including deployment and maintenance. Proficiency in managing operations with predefined SLAs. Expertise in CI/CD Ops pipeline development. Experience with model registry and management. Knowledge of coding checklists and best practices. Proficiency in developing and automating testing frameworks. Experience in designing and managing inference pipelines. Production experience with commercial and open-source ML platforms. Strong knowledge of AWS, Databricks, and Snowflake service offerings. Ability to collaborate with business teams to gather requirements, groom product backlogs, and drive delivery. Agile delivery experience managing multiple concurrent delivery cycles. Solid foundation in CRISP analytical life cycle management. Strong leadership skills with the ability to build high-performing teams. Excellent vendor management and IT governance skills. Innovative and analytical mindset with a focus on continuous improvement. Emerging technology monitoring, consulting, influencing & persuading, unbossed leadership, IT governance, building high-performing teams, vendor management, innovative & analytical technologies. Strong understanding of descriptive vs. prescriptive analytical frameworks. Strong knowledge of visualization platforms and project life cycle management, including Power BI, Qlik, and MicroStrategy. Significant production experience addressing visualization platform and data pipeline performance constraints. Strong analytical and problem-solving skills, effective communication, and the ability to influence and collaborate with cross-functional teams.

Commitment To Diversity And Inclusion
Novartis is committed to building an outstanding, inclusive work environment and diverse teams representative of the patients and communities we serve.

Accessibility and accommodation
Novartis is committed to working with and providing reasonable accommodation to individuals with disabilities.
If, because of a medical condition or disability, you need a reasonable accommodation for any part of the recruitment process, or in order to perform the essential functions of a position, please send an e-mail to diversityandincl.india@novartis.com and let us know the nature of your request and your contact information. Please include the job requisition number in your message Why Novartis: Helping people with disease and their families takes more than innovative science. It takes a community of smart, passionate people like you. Collaborating, supporting and inspiring each other. Combining to achieve breakthroughs that change patients’ lives. Ready to create a brighter future together? https://www.novartis.com/about/strategy/people-and-culture Join our Novartis Network: Not the right Novartis role for you? Sign up to our talent community to stay connected and learn about suitable career opportunities as soon as they come up: https://talentnetwork.novartis.com/network Benefits and Rewards: Read our handbook to learn about all the ways we’ll help you thrive personally and professionally: https://www.novartis.com/careers/benefits-rewards
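
The model-registry and CI/CD responsibilities described in this posting are commonly handled with a tool such as MLflow (one possible choice, not named by the posting); a minimal sketch with a hypothetical tracking server and model name:

```python
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_regression
from sklearn.linear_model import Ridge

# Train a toy model and log it to an MLflow tracking server (URI is hypothetical).
mlflow.set_tracking_uri("http://mlflow.internal:5000")
mlflow.set_experiment("demand-forecast")

X, y = make_regression(n_samples=500, n_features=10, random_state=0)

with mlflow.start_run() as run:
    model = Ridge(alpha=1.0).fit(X, y)
    mlflow.log_param("alpha", 1.0)
    mlflow.sklearn.log_model(model, artifact_path="model")

# Register the logged model so deployment pipelines can pick up new versions.
mlflow.register_model(f"runs:/{run.info.run_id}/model", "demand-forecast")
```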

Posted 1 week ago

Apply

8.0 years

0 Lacs

Hyderābād

On-site

Summary Are you passionate about the intersection of data, technology and science, and excited by the potential of Real-World Data (RWD) and AI? Do you thrive in collaborative environments and aspire to contribute to the discovery of groundbreaking medical insights? If so, join the data42 team at Novartis! At Novartis, we reimagine medicine by leveraging state-of-the-art analytics and our extensive internal and external data resources. Our data42 platform grants access to high-quality, multi-modal preclinical and clinical data, along with RWD, creating the optimal environment for developing advanced AI/ML models and generating health insights. Our global team of data scientists and engineers utilizes this platform to uncover novel insights and guide drug development decisions. As an RWD SME / RWE Execution Data Scientist, you will focus on executing innovative methodologies and AI models to mine RWD on the data42 platform. You will be the go-to authority for leveraging diverse RWD modalities to surface the patterns crucial to understanding patient populations, biomarkers, and drug targets, accelerating the development of life-changing medicines.
About the Role Duties and Responsibilities: Collaborate with R&D stakeholders to co-create and implement innovative, repeatable, scalable and automatable data and technology solutions in line with the data42 strategy. Be a data SME: understand RWD of different modalities, vocabularies (LOINC, ICD, HCPCS, etc.), and non-traditional RWD (patient-reported outcomes, wearables and mobile health data), and where and how they can be used, including in conjunction with clinical data, omics data, pre-clinical data, and commercial data. Contribute to data strategy implementation, such as federated learning, tokenization, data quality frameworks, regulatory requirements (conversion of submission data to HL7 FHIR formats, Sentinel initiative), conversion to common data models and standards (OMOP, FHIR, SEND, etc.), FAIR principles, and integration with the enterprise catalog. Define and execute advanced, integrated, and scalable analytical approaches and research methodologies (including industry trends) in support of exploratory and regulatory work, using AI models for RWD analysis across the Research, Development, and Commercial continuum to address research questions. Stay current with emerging applications and trends, driving the development of advanced analytic capabilities for data42 across the real-world evidence generation lifecycle, from ideation to study design and execution. Demonstrate high agility working with cross-located and cross-functional associates across business domains (Commercial, Development, Biomedical Research) or therapeutic area divisions for our priority disease areas to solve complex and critical business problems with quantified business impact/ROI. Draft and edit high-level research documents (proposals, protocols, statistical analysis plans) (optional). Knowledge of governance, ethical and privacy considerations (optional).
Ideal Candidate Profile: PhD or MSc in a quantitative discipline (e.g., but not restricted to, Computer Science, Physics, Statistics, Epidemiology) with proven expertise in AI/ML. 8+ years of relevant experience in Data Science (or 4+ years post-qualification in case of PhD).
Extensive experience in Statistical and Machine Learning techniques: Regression, Classification, Clustering, Design of Experiments, Monte Carlo Simulations, Statistical Inference, Feature Engineering, Time Series Forecasting, Text Mining, Natural Language Processing, LLMs, and multi-modal Generative AI. Good-to-have skills: stochastic models, Bayesian models, Markov chains, optimization techniques (including Dynamic Programming), Deep Learning techniques on structured and unstructured data, and Recommender Systems. Proficiency in tools and packages: Python, R (optional), SQL; exposure to dashboard or web-app building using Power BI, R-Shiny, Flask, or other open-source or proprietary software and packages is an advantage. Knowledge of data standards (e.g., OHDSI OMOP), HL7 FHIR for regulatory use, and related best practices. Good to have: Foundry, big data programming, and working knowledge of executing data science on AWS, Databricks, or Snowflake. Strong in matrix collaboration environments, with good communication and collaboration skills with country, regional, and global stakeholders in an individual contributor capacity. High learning agility and adherence to updates in the industry and area of work. Optional: experience in biomedical research and development in pharma is a bonus.
Why Novartis: Helping people with disease and their families takes more than innovative science. It takes a community of smart, passionate people like you. Collaborating, supporting and inspiring each other. Combining to achieve breakthroughs that change patients’ lives. Ready to create a brighter future together? https://www.novartis.com/about/strategy/people-and-culture
Join our Novartis Network: Not the right Novartis role for you? Sign up to our talent community to stay connected and learn about suitable career opportunities as soon as they come up: https://talentnetwork.novartis.com/network
Benefits and Rewards: Read our handbook to learn about all the ways we’ll help you thrive personally and professionally: https://www.novartis.com/careers/benefits-rewards
Division: Biomedical Research; Business Unit: Innovative Medicines; Location: India; Site: Hyderabad (Office); Company / Legal Entity: IN10 (FCRS = IN010) Novartis Healthcare Private Limited; Alternative Location 1: Dublin (NOCC), Ireland; Functional Area: Research & Development; Job Type: Full time; Employment Type: Regular; Shift Work: No
Accessibility and accommodation: Novartis is committed to working with and providing reasonable accommodation to individuals with disabilities. If, because of a medical condition or disability, you need a reasonable accommodation for any part of the recruitment process, or in order to perform the essential functions of a position, please send an e-mail to [email protected] and let us know the nature of your request and your contact information. Please include the job requisition number in your message. Novartis is committed to building an outstanding, inclusive work environment and diverse teams representative of the patients and communities we serve.
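The profile above lists classification, clustering, and feature-engineering work on patient-level data. As a purely illustrative sketch of that kind of workflow (the cohort file, feature columns, and outcome label below are hypothetical and not taken from the posting), a scikit-learn pipeline combining feature preprocessing with a classifier might look like this:

```python
# Minimal, illustrative sketch only: the cohort file, feature names, and
# outcome column are hypothetical and not taken from the job posting.
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler

cohort = pd.read_csv("patient_cohort.csv")  # hypothetical extract from an RWD source

numeric = ["age", "bmi", "num_prior_visits"]
categorical = ["sex", "primary_diagnosis_code"]
X, y = cohort[numeric + categorical], cohort["had_outcome_within_1y"]

pipeline = Pipeline([
    ("features", ColumnTransformer([
        ("num", StandardScaler(), numeric),
        ("cat", OneHotEncoder(handle_unknown="ignore"), categorical),
    ])),
    ("model", RandomForestClassifier(n_estimators=200, random_state=42)),
])

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y
)
pipeline.fit(X_train, y_train)
print("AUC:", roc_auc_score(y_test, pipeline.predict_proba(X_test)[:, 1]))
```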

Posted 1 week ago

Apply

3.0 years

0 Lacs

Hyderābād

On-site

JOB DESCRIPTION We have an exciting and rewarding opportunity for you to take your software engineering career to the next level. As a Software Engineer III at JPMorgan Chase within the Firmwide Resiliency Office Tech (ERMA), you will be a highly skilled and experienced Senior Reporting developer with expertise in Tableau, Qlik, IBM Cognos, and other BI tools. You will design, develop, and lead complex reporting and analytics solutions, promoting the delivery of self-service reporting, operational dashboards, and compliance solutions. You will leverage your deep knowledge of data visualization, ETL processes, and data warehousing, interacting with business users and the reporting and DB/ETL team to enforce best practices and ensure high-quality deliverables in a timely fashion. You should have a proven track record of strong communication and people skills and experience working in a structured and agile software development lifecycle. Job responsibilities Intake, analyze, design, model, develop, assure quality, and support requirements. Collaborate with business users to model Business Analytics. Design and implement interactive reports and dashboards using Tableau. Ensure data accuracy and integrity in Dashboards/Reports. Troubleshoot and resolve issues. Write complex relational Oracle queries. Innovate to deliver business value, improve efficiency, and reduce cost. Required qualifications, capabilities, and skills. Formal training or certification on software engineering concepts and 3+ years applied experience Experience in data warehousing, business intelligence, and reporting platforms, with a strong focus on Tableau, Qlik and IBM Cognos. Proven expertise in developing interactive and dynamic dashboards, reports, and visualizations. Good understanding of data warehouse concepts and working experience in relational and dimensional modelling is required. Technical depth and ability to rapidly understand and communicate the impact of solutions and technology. Experience collaborating with business representatives to understand requirements, value, prioritization and business intelligence needs. Good analytical, problem-solving, and communication skills. Ability to work both independently and in a team environment, with a proactive approach to managing multiple priorities. Experience working in onshore/offshore model. Preferred qualifications, capabilities, and skills Familiarity with modern technologies Exposure to cloud technologies Any experience reporting on Databricks platform. ABOUT US JPMorganChase, one of the oldest financial institutions, offers innovative financial solutions to millions of consumers, small businesses and many of the world’s most prominent corporate, institutional and government clients under the J.P. Morgan and Chase brands. Our history spans over 200 years and today we are a leader in investment banking, consumer and small business banking, commercial banking, financial transaction processing and asset management. We recognize that our people are our strength and the diverse talents they bring to our global workforce are directly linked to our success. We are an equal opportunity employer and place a high value on diversity and inclusion at our company. We do not discriminate on the basis of any protected attribute, including race, religion, color, national origin, gender, sexual orientation, gender identity, gender expression, age, marital or veteran status, pregnancy or disability, or any other basis protected under applicable law. 
We also make reasonable accommodations for applicants’ and employees’ religious practices and beliefs, as well as mental health or physical disability needs. Visit our FAQs for more information about requesting an accommodation. ABOUT THE TEAM Our Corporate Technology team relies on smart, driven people like you to develop applications and provide tech support for all our corporate functions across our network. Your efforts will touch lives all over the financial spectrum and across all our divisions: Global Finance, Corporate Treasury, Risk Management, Human Resources, Compliance, Legal, and within the Corporate Administrative Office. You’ll be part of a team specifically built to meet and exceed our evolving technology needs, as well as our technology controls agenda.

Posted 1 week ago

Apply

3.0 years

1 - 10 Lacs

Hyderābād

On-site

We have an exciting and rewarding opportunity for you to take your software engineering career to the next level. As a Software Engineer III at JPMorgan Chase within the Firmwide Resiliency Office Tech (ERMA), you will be a highly skilled and experienced Senior Reporting developer with expertise in Tableau, Qlik, IBM Cognos, and other BI tools. You will design, develop, and lead complex reporting and analytics solutions, promoting the delivery of self-service reporting, operational dashboards, and compliance solutions. You will leverage your deep knowledge of data visualization, ETL processes, and data warehousing, interacting with business users and the reporting and DB/ETL team to enforce best practices and ensure high-quality deliverables in a timely fashion. You should have a proven track record of strong communication and people skills and experience working in a structured and agile software development lifecycle. Job responsibilities Intake, analyze, design, model, develop, assure quality, and support requirements. Collaborate with business users to model Business Analytics. Design and implement interactive reports and dashboards using Tableau. Ensure data accuracy and integrity in Dashboards/Reports. Troubleshoot and resolve issues. Write complex relational Oracle queries. Innovate to deliver business value, improve efficiency, and reduce cost. Required qualifications, capabilities, and skills. Formal training or certification on software engineering concepts and 3+ years applied experience Experience in data warehousing, business intelligence, and reporting platforms, with a strong focus on Tableau, Qlik and IBM Cognos. Proven expertise in developing interactive and dynamic dashboards, reports, and visualizations. Good understanding of data warehouse concepts and working experience in relational and dimensional modelling is required. Technical depth and ability to rapidly understand and communicate the impact of solutions and technology. Experience collaborating with business representatives to understand requirements, value, prioritization and business intelligence needs. Good analytical, problem-solving, and communication skills. Ability to work both independently and in a team environment, with a proactive approach to managing multiple priorities. Experience working in onshore/offshore model. Preferred qualifications, capabilities, and skills Familiarity with modern technologies Exposure to cloud technologies Any experience reporting on Databricks platform.

Posted 1 week ago

Apply

7.0 - 10.0 years

5 - 10 Lacs

Gurgaon

On-site

Senior Manager EXL/SM/1408721, Services, Gurgaon. Posted On: 01 Jul 2025; End Date: 15 Aug 2025; Required Experience: 7 - 10 Years.
Basic Section: Number Of Positions: 3; Band: C2; Band Name: Senior Manager; Cost Code: D014126; Campus/Non Campus: NON CAMPUS; Employment Type: Permanent; Requisition Type: New; Max CTC: 2500000.0000 - 3200000.0000; Complexity Level: Not Applicable; Work Type: Hybrid – Working Partly From Home And Partly From Office; Organisational Group: Analytics; Sub Group: Analytics - UK & Europe; Organization: Services; LOB: Analytics - UK & Europe; SBU: Analytics; Country: India; City: Gurgaon; Center: EXL - Gurgaon Center 38; Skills: AZURE DATABRICKS, MACHINE LEARNING, PYSPARK; Minimum Qualification: ANY GRADUATE; Certification: No data available.
Job Description: We are seeking a dynamic professional with strong experience in Databricks and Machine Learning to design and implement scalable data pipelines and ML solutions. The ideal candidate will work closely with data scientists, analysts, and business teams to deliver high-performance data products and predictive models. Key Responsibilities: Design, develop, and optimize data pipelines using Databricks, PySpark, and Delta Lake; build and deploy Machine Learning models at scale; perform data wrangling, feature engineering, and model tuning; collaborate with cross-functional teams for ML model integration and monitoring; implement MLflow for model versioning and tracking; ensure best practices in MLOps, code management, and automation. Must-Have Skills: Hands-on experience with Databricks, Spark, and SQL; strong knowledge of ML algorithms, Python (Pandas, Scikit-learn), and model deployment; familiarity with cloud platforms (Azure / AWS / GCP); experience with CI/CD pipelines and ML lifecycle management tools. Good to Have: Exposure to data governance, monitoring tools, and performance optimization; knowledge of Docker/Kubernetes and REST API integration. Workflow Type: L&S-DA-Consulting
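For context on the Databricks and MLflow workflow this posting describes (reading a Delta table with PySpark, training a model, and tracking it with MLflow), here is a minimal sketch; the Delta path, feature columns, and experiment name are illustrative assumptions rather than details from the role:

```python
# Illustrative sketch of a Databricks-style train-and-track step; the Delta
# path, column names, and experiment name below are hypothetical.
import mlflow
import mlflow.sklearn
from pyspark.sql import SparkSession
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

spark = SparkSession.builder.appName("demand-model").getOrCreate()
features = spark.read.format("delta").load("/mnt/lake/curated/customer_features").toPandas()

X = features[["recency_days", "frequency", "monetary_value"]]
y = features["churned"]
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=7)

mlflow.set_experiment("/Shared/churn-model")  # hypothetical experiment path
with mlflow.start_run():
    model = GradientBoostingClassifier(n_estimators=150, learning_rate=0.05)
    model.fit(X_train, y_train)
    acc = accuracy_score(y_test, model.predict(X_test))
    mlflow.log_param("n_estimators", 150)
    mlflow.log_metric("accuracy", acc)
    mlflow.sklearn.log_model(model, "model")  # versioned artifact for later registry promotion
```

Logging the parameters, metric, and model artifact in one run is what makes later model versioning and promotion through a registry straightforward.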

Posted 1 week ago

Apply

7.0 years

5 - 8 Lacs

Gurgaon

On-site

Who We Are Boston Consulting Group partners with leaders in business and society to tackle their most important challenges and capture their greatest opportunities. BCG was the pioneer in business strategy when it was founded in 1963. Today, we help clients with total transformation-inspiring complex change, enabling organizations to grow, building competitive advantage, and driving bottom-line impact. To succeed, organizations must blend digital and human capabilities. Our diverse, global teams bring deep industry and functional expertise and a range of perspectives to spark change. BCG delivers solutions through leading-edge management consulting along with technology and design, corporate and digital ventures—and business purpose. We work in a uniquely collaborative model across the firm and throughout all levels of the client organization, generating results that allow our clients to thrive. What You'll Do The Global Information and AI Security Senior Manager provides internal BCG technical consulting around information security architecture and security design measures for new projects, ventures and systems. The architect defines the desired end state to meet solution Security Goals and overall business goals. The Security Architect ensures the digital applications, tools, and services protect our data, our clients’ data, and our intellectual property; are resilient to cyber-attack; meet BCG policy and standards, regulatory requirements, and industry best practices; while using a risk-based approach to meeting BCG business needs and objectives. The Global Information and AI Security Senior Manager works with teams inside BCG to secure the building and maintenance of complex computing environments to train, deploy, and operate Artificial Intelligence/ML systems by determining security requirements; planning, implementing and testing security systems; participate in AI/ML/LLM projects as the Security Subject Matter Expert; preparing security standards, policies and procedures; and mentoring team members. What You'll Bring Bachelor's degree (or equivalent experience) required. CSSLP certification required; additional certifications such as CISSP, CCSP, or CCSK strongly preferred. 7+ years of progressive experience in information security, specifically focused on secure architecture, secure development practices, and cloud-native security. Proven expertise supporting software engineering, data science, and AI/ML development teams, specifically with secure model lifecycle management, secure deployment practices, and secure data engineering. Expert understanding of the Secure Software Development Lifecycle (SSDLC), including secure architecture, threat modeling frameworks (e.g., MAESTRO, PASTA, STRIDE), penetration testing, secure coding practices, vulnerability management, and incident response. Demonstrated technical proficiency across multiple security technologies, platforms, and frameworks, with strong hands-on experience implementing secure cloud-native infrastructures (AWS, Azure, GCP). Familiarity with data warehouse and data lake environments such as Databricks, Azure Fabric, or Snowflake, including security best practices in managing and securing large-scale data ecosystems. In-depth knowledge and practical experience with AI and machine learning model security, ethical AI frameworks, secure handling of data, and comprehensive understanding of CI/CD pipelines specifically tailored for data science workloads. 
Extensive experience conducting security assessments, vulnerability triage, intrusion detection and prevention, firewall management, network vulnerability analysis, cryptographic implementations, and incident response analysis. Exceptional communication skills (written and oral), influencing capabilities, and ability to clearly articulate complex security concepts to stakeholders across various levels of the organization. Proactive professional development, continuous learning, active participation in industry forums, professional networks, and familiarity with current and emerging security trends and standards. Additional info YOU'RE GOOD AT The Senior Manager, Security and AI Architect excels at: Collaborating closely with software engineering, data science, data engineering, and cybersecurity teams to design, implement, and maintain secure solutions in agile environments leveraging cloud-native technologies and infrastructure. Defining security requirements by deeply understanding business objectives, evaluating strategies, and implementing robust security standards throughout the full Software Development Life Cycle (SDLC). Leading security risk assessments, threat modeling (utilizing frameworks such as MAESTRO, PASTA, STRIDE, etc.), security architecture reviews, and vulnerability analyses for client-facing digital products, particularly involving complex AI/ML-driven solutions. Advising development teams, including AI engineers and data scientists, on secure coding practices, secure data handling, secure AI/ML model deployment, and related infrastructure security considerations. Providing specialized guidance on secure AI model development lifecycle, including secure data usage, ethical AI practices, and robust security controls in Generative AI and large language model deployments. Actively participating in the APAC Dex process for managing digital builds, ensuring alignment with regional requirements, standards, and best practices. Staying ahead of emerging security trends and technologies, conducting continuous research, evaluation, and advocacy of new security tools, frameworks, and architectures relevant to digital solutions. Ensuring robust compliance with regulatory frameworks and industry standards, including ISO 27001, SOC2, NIST, and GDPR, particularly as they pertain to data privacy and AI-driven product development. Developing and delivering training programs on secure development, AI security considerations, and incident response practices. Partnering with internal stakeholders, articulating security risks clearly, influencing technical directions, and promoting comprehensive secure architecture roadmaps. Conducting vendor and market assessments, guiding tests, evaluations, and implementation of security products that address enterprise and client-specific information security requirements. Advising teams on compensating controls and alternative security measures to facilitate business agility without compromising security posture. Leading the implementation and continuous improvement of security tooling and practices within CI/CD pipelines, infrastructure-as-code (IaC), and model deployment automation. Boston Consulting Group is an Equal Opportunity Employer. 
All qualified applicants will receive consideration for employment without regard to race, color, age, religion, sex, sexual orientation, gender identity / expression, national origin, disability, protected veteran status, or any other characteristic protected under national, provincial, or local law, where applicable, and those with criminal histories will be considered in a manner consistent with applicable state and local laws. BCG is an E-Verify Employer.

Posted 1 week ago

Apply

6.0 years

0 Lacs

Hyderabad, Telangana, India

Remote

Overview As a member of the data engineering team, you will be the key technical expert developing and overseeing PepsiCo's data product build & operations and driving a strong vision for how data engineering can proactively create a positive impact on the business. You'll be an empowered member of a team of data engineers who build data pipelines into various source systems, rest data on the PepsiCo Data Lake, and enable exploration and access for analytics, visualization, machine learning, and product development efforts across the company. As a member of the data engineering team, you will help lead the development of very large and complex data applications into public cloud environments, directly impacting the design, architecture, and implementation of PepsiCo's flagship data products around topics like revenue management, supply chain, manufacturing, and logistics. You will work closely with process owners, product owners and business users. You'll be working in a hybrid environment with in-house, on-premise data sources as well as cloud and remote systems. Responsibilities Active contributor to code development in projects and services. Manage and scale data pipelines from internal and external data sources to support new product launches and drive data quality across data products. Build and own the automation and monitoring frameworks that capture metrics and operational KPIs for data pipeline quality and performance. Responsible for implementing best practices around systems integration, security, performance and data management. Empower the business by creating value through the increased adoption of data, data science and the business intelligence landscape. Collaborate with internal clients (data science and product teams) to drive solutioning and POC discussions. Evolve the architectural capabilities and maturity of the data platform by engaging with enterprise architects and strategic internal and external partners. Develop and optimize procedures to “productionalize” data science models. Define and manage SLAs for data products and processes running in production. Support large-scale experimentation done by data scientists. Prototype new approaches and build solutions at scale. Research state-of-the-art methodologies. Create documentation for learnings and knowledge transfer. Create and audit reusable packages or libraries. Qualifications 6+ years of overall technology experience that includes at least 4+ years of hands-on software development, data engineering, and systems architecture. 4+ years of experience with Data Lake Infrastructure, Data Warehousing, and Data Analytics tools. 4+ years of experience in SQL optimization and performance tuning, and development experience in programming languages like Python, PySpark, Scala, etc. 2+ years of cloud data engineering experience in Azure. Fluent with Azure cloud services; Azure Certification is a plus. Experience with integration of multi-cloud services with on-premises technologies. Experience with data modeling, data warehousing, and building high-volume ETL/ELT pipelines. Experience with data profiling and data quality tools like Apache Griffin, Deequ, and Great Expectations. Experience building/operating highly available, distributed systems of data extraction, ingestion, and processing of large data sets. Experience with at least one MPP database technology such as Redshift, Synapse or Snowflake. Experience with running and scaling applications on cloud infrastructure and containerized services like Kubernetes.
Experience with version control systems like GitHub and deployment & CI tools. Experience with Azure Data Factory, Azure Databricks and Azure Machine Learning tools. Experience with statistical/ML techniques is a plus. Experience building solutions in the retail or supply chain space is a plus. Understanding of metadata management, data lineage, and data glossaries is a plus. Working knowledge of agile development, including DevOps and DataOps concepts. Familiarity with business intelligence tools (such as Power BI). BA/BS in Computer Science, Math, Physics, or other technical fields.
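To make the responsibility of building "automation and monitoring frameworks that capture metrics and operational KPIs for data pipeline quality" concrete, here is a minimal PySpark sketch of a data-quality gate; the Delta path, critical columns, and thresholds are hypothetical assumptions, not details from the posting:

```python
# Minimal data-quality check sketch; the table path, columns, and thresholds
# are hypothetical. A real framework would persist these metrics for monitoring.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq-checks").getOrCreate()

# Hypothetical Delta table produced by an upstream ingestion job.
df = spark.read.format("delta").load("/mnt/datalake/curated/orders")

critical_cols = ["order_id", "customer_id", "order_date"]

row_count = df.count()
null_rates = {
    c: df.filter(F.col(c).isNull()).count() / max(row_count, 1)
    for c in critical_cols
}
dup_count = row_count - df.dropDuplicates(["order_id"]).count()

metrics = {"row_count": row_count, "duplicate_orders": dup_count}
metrics.update({f"null_rate_{c}": r for c, r in null_rates.items()})
print(metrics)

# Fail the pipeline run if thresholds are breached so the orchestrator can alert.
assert dup_count == 0, "duplicate order_id values found"
assert all(r < 0.01 for r in null_rates.values()), "null rate above 1% on a critical column"
```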

Posted 1 week ago

Apply

5.0 years

3 - 9 Lacs

Noida

On-site

Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together. The Cloud Migration and Modernization Engineering (CMME) team is seeking an experienced, lead-level cloud solution architect with a demonstrated ability to evaluate, solution, estimate, implement and document cloud-centric application architectures and deployment methodologies. The candidate must be able to advise application and data migration efforts and directly contribute to the design and implementation of varied application solutions, IaC implementation, and operational capabilities. We solution for and operate across Azure, AWS, and GCP cloud service providers; however, this position will have an Azure focus. Team members in this role must understand development and deployment workflows and be capable of providing appropriate and actionable guidance pertinent to re-factoring applications towards more cloud-optimal architectures and alignment to cloud well-architected best practices. Individuals in this role must have the ability to drive technical solutioning efforts, mentor team members, and work in collaboration with other teams to drive application migration and modernization efforts. Primary Responsibility: Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so. Required Qualifications: Azure Certification(s) (verifiable); Azure - 5+ years solutioning and deploying applications in Azure cloud; Lead Experience - 3+ years in a role demonstrating ability to own and drive deliverables with minimal technical guidance; experience architecting for both Linux and Windows environments; Azure function apps (serverless architectures); Azure Container Apps / AKS (proficiency with containerized deployment architectures); Azure Database & Cosmos DB (relational and NoSQL database design, data management); networking (solid understanding of general and cloud virtual networking); Azure cloud security and recovery practices; web services and API design; Azure data streaming/messaging services (e.g. Stream Analytics, Functions, Kafka); IaC with Terraform; CI/CD automation using GitHub / GitHub Actions; implementation of auto-scaling, cost-management techniques, and operational monitoring. Preferred Qualifications: Golang / Python / .Net-C# / Java (prefer at least one, with a leaning towards Golang); experience with alternative cloud service providers (AWS, GCP); HashiCorp Vault; Kubernetes; Databricks pipelines; MLOps. At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone - of every race, gender, sexuality, age, location and income - deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.
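As a rough illustration of the serverless pattern the qualifications reference (Azure function apps), here is a minimal sketch assuming the Azure Functions Python v2 programming model; the route name and response payload are hypothetical:

```python
# Minimal Azure Functions sketch (Python v2 programming model); the route name
# and response payload are hypothetical illustrations, not from the posting.
import json

import azure.functions as func

app = func.FunctionApp()

@app.route(route="health", auth_level=func.AuthLevel.ANONYMOUS)
def health(req: func.HttpRequest) -> func.HttpResponse:
    # A real function would call downstream services (Cosmos DB, queues, etc.).
    body = {"status": "ok", "service": "claims-api"}
    return func.HttpResponse(json.dumps(body), mimetype="application/json", status_code=200)
```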

Posted 1 week ago

Apply

2.0 - 4.0 years

5 - 8 Lacs

Noida

On-site

Expertise in AWS services like EC2, CloudFormation, S3, IAM, SNS, SQS, EMR, Athena, Glue, Lake Formation, etc. Expertise in Hadoop/EMR/Databricks with good debugging skills to resolve Hive- and Spark-related issues. Sound fundamentals of database concepts and experience with relational or non-relational database types such as SQL, key-value, graph, etc. Experience in infrastructure provisioning using CloudFormation, Terraform, Ansible, etc. Experience in programming languages such as Python/PySpark. Excellent written and verbal communication skills. Key Responsibilities Working closely with the data lake engineers to provide technical guidance, consultation and resolution of their queries. Assist in development of simple and advanced analytics best practices, processes, technology & solution patterns and automation (including CI/CD). Working closely with various stakeholders in the US team with a collaborative approach. Develop data pipelines in Python/PySpark to be executed in the AWS cloud. Set up analytics infrastructure in AWS using CloudFormation templates. Develop mini/micro-batch and streaming ingestion patterns using Kinesis/Kafka. Seamlessly upgrade the application to higher versions, such as Spark/EMR upgrades. Participate in code reviews of the developed modules and applications. Provide inputs for formulation of best practices for ETL processes/jobs written in programming languages such as PySpark, and for BI processes. Work with column-oriented data storage formats such as Parquet, interactive query services such as Athena, and event-driven computing cloud services such as Lambda. Perform R&D on the latest big data offerings in the market, perform comparative analysis, and provide recommendations to choose the best tool as per the current and future needs of the enterprise. Required Qualifications Bachelor's or Master's degree in Computer Science or a similar field. 2-4 years of strong experience in big data development. Expertise in AWS services like EC2, CloudFormation, S3, IAM, SNS, SQS, EMR, Athena, Glue, Lake Formation, etc. Expertise in Hadoop/EMR/Databricks with good debugging skills to resolve Hive- and Spark-related issues. Sound fundamentals of database concepts and experience with relational or non-relational database types such as SQL, key-value, graph, etc. Experience in infrastructure provisioning using CloudFormation, Terraform, Ansible, etc. Experience in programming languages such as Python/PySpark. Excellent written and verbal communication skills. Preferred Qualifications Cloud certification (AWS, Azure or GCP). About Our Company Ameriprise India LLP has been providing client-based financial solutions to help clients plan and achieve their financial objectives for 125 years. We are a U.S.-based financial planning company headquartered in Minneapolis with a global presence. The firm’s focus areas include Asset Management and Advice, Retirement Planning and Insurance Protection. Be part of an inclusive, collaborative culture that rewards you for your contributions and work with other talented individuals who share your passion for doing great work. You’ll also have plenty of opportunities to make your mark at the office and a difference in your community. So if you're talented, driven and want to work for a strong ethical company that cares, take the next step and create a career at Ameriprise India LLP. Ameriprise India LLP is an equal opportunity employer.
We consider all qualified applicants without regard to race, color, religion, sex, genetic information, age, sexual orientation, gender identity, disability, veteran status, marital status, family status or any other basis prohibited by law. Full-Time/Part-Time Full time Timings (2:00p-10:30p) India Business Unit AWMPO AWMP&S President's Office Job Family Group Technology
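The responsibilities above pair columnar Parquet storage with Athena for interactive queries. A minimal PySpark sketch of that pattern follows; the S3 bucket, file layout, and column names are hypothetical assumptions:

```python
# Illustrative curation step: land raw CSV as partitioned Parquet so Athena
# can query it cheaply. Bucket paths and columns below are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("parquet-curation").getOrCreate()

# Hypothetical raw CSV landed in S3 by an upstream ingestion job.
raw = spark.read.option("header", True).csv("s3://example-bucket/raw/transactions/")

curated = (
    raw.withColumn("txn_date", F.to_date("txn_timestamp"))
       .withColumn("amount", F.col("amount").cast("double"))
)

# Columnar, date-partitioned Parquet limits Athena scans to the partitions a query touches.
(curated.write
    .mode("overwrite")
    .partitionBy("txn_date")
    .parquet("s3://example-bucket/curated/transactions/"))
```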

Posted 1 week ago

Apply

4.0 years

10 - 12 Lacs

Noida

Remote

Job Role: Azure Data Engineer Exp: 4+ Years Location: Remote Type: Contract, 6 Months+ (Extendable) Job Overview: An Azure Data Engineer is responsible for designing, building, and maintaining scalable data solutions using Microsoft Azure services. The role typically involves working with big data tools, ETL pipelines, data lakes, and data warehousing solutions. Key Responsibilities: Design and implement data pipelines and ETL processes using Azure Data Factory. Build and maintain data lakes and warehouses using Azure Synapse or Azure Data Lake. Integrate and transform data from various sources. Optimize data storage and access for performance and cost-efficiency. Ensure data security and compliance across cloud platforms. Collaborate with Data Scientists, Analysts, and other Engineers. Required Skills & Technologies: Azure Services: Azure Data Factory, Azure Data Lake, Azure Synapse Analytics, Azure SQL Database, Azure Databricks. Data Integration: ETL/ELT, data ingestion from on-prem and cloud sources. Programming Languages: Python, SQL, Spark (PySpark). Data Modeling & Warehousing: Star/Snowflake schemas, dimensional modeling. Big Data Frameworks: Spark, Hive, Databricks (optional but preferred). DevOps Tools: Azure DevOps, CI/CD pipelines, Git. Monitoring & Security: Role-based access control (RBAC), data encryption, monitoring/logging tools. Job Types: Full-time, Permanent Pay: ₹90,000.00 - ₹100,000.00 per month Experience: Azure Data Engineer: 4 years (Required); Databricks: 4 years (Required); PySpark: 4 years (Required) Work Location: In person
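For readers unfamiliar with the star-schema modeling the skills list mentions, here is a minimal PySpark sketch of a dimensional load (a staged fact joined to a dimension on a business key, then written as Delta); the table paths and column names are hypothetical:

```python
# Illustrative star-schema load; the lake paths and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("star-schema-load").getOrCreate()

# Hypothetical staged sales data and a customer dimension maintained in the lake.
stg_sales = spark.read.format("delta").load("/mnt/stage/sales")
dim_customer = spark.read.format("delta").load("/mnt/warehouse/dim_customer")

fact_sales = (
    stg_sales.join(dim_customer, on="customer_business_key", how="left")
             .select(
                 F.col("customer_sk"),                 # surrogate key from the dimension
                 F.to_date("sale_ts").alias("date_key"),
                 "product_id",
                 "quantity",
                 "net_amount",
             )
)

fact_sales.write.format("delta").mode("append").save("/mnt/warehouse/fact_sales")
```

Resolving the business key to a surrogate key at load time is what keeps the fact table thin and the dimension free to change independently.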

Posted 1 week ago

Apply

20.0 years

6 - 7 Lacs

Noida

On-site

Open Position: UX Intern We exist to build great software. It’s our shared purpose and brand promise to our customers. We offer technology services in data & analytics, generative AI, machine learning, reporting & visualization, application modernization, cloud optimization, and security. Clients consistently recognize us for our technical leadership and excellence in execution. Recognized 11 times on the Inc. 5000 list for sustained organic growth, MAQ Software continues to evolve as a leader in enterprise software and AI innovation—delivering measurable impact and long-term value to our clients. As a Microsoft Fabric Featured Partner , we help leading enterprises accelerate their business intelligence and analytics initiatives. Our solutions empower clients to improve their operations, reduce costs, increase sales, and strengthen customer relationships. To enhance Power BI capabilities, we have published 43 certified custom visuals , now with over 3 million downloads . We are the third largest provider of Power BI visuals on Microsoft AppSource. Our most popular visuals include Gantt Chart , Calendar Visual , Organization Chart , and Box and Whisker Chart . We operate within the professional services spectrum , focusing on procedure services —where specialized knowledge, repeatable processes, and expert solutions differentiate us from those offering standardized, commodity services. Our emphasis on technical expertise, consistent delivery, and agility is why clients trust us with complex, high-impact projects. Our software delivery model is built on: Daily software updates Agile project management Transparent workflows Domain-specific expertise Rapid feedback and iteration cycles Exclusively focused on enterprise customers, we currently serve many Fortune 500 companies . As a premier supplier to Microsoft for over 20 years , our clients benefit from deep platform knowledge and mature engineering practices. Microsoft has awarded us eleven specializations , recognizing our commitment to delivering service excellence at scale. With globally distributed teams across Redmond, Washington and Noida, Mumbai, and Hyderabad, India , we operate with increased velocity and tech intensity . Our integrated teams deliver solutions that are both scalable and adaptable to evolving business needs. Fueling the Future of AI Engineering Our commitment to innovation continues with the opening of our new AI Engineering Center in Noida —a 12-story, 350,000-square-foot campus that strengthens our presence in India and supports the future of enterprise AI solutions. Key Features of the Noida Campus Modern, Collaborative Workspaces: Designed to encourage creativity and teamwork, the campus features open workspaces, collaborative zones, and advanced technology for a productive, engaging environment. AI Innovation Hub: This campus will be a focal point for AI research and development, with a special emphasis on Agentic AI, Large Language Models (LLMs), machine learning, natural language processing, and data analytics. Enhanced Client Collaboration: The new facility will enable closer collaboration with clients, allowing us to deliver tailored AI solutions that meet their specific needs. Talent Development: We are committed to nurturing the next generation of AI engineers. The Noida campus includes dedicated spaces for training, mentorship, and career growth. 
Sustainability Highlights Designed for LEED Gold certification A 598 KW solar power system supplies nearly 50% of the building’s energy Zero-discharge water system with rainwater harvesting Energy-efficient facade with solar heat-reducing fins Engineering Culture: Innovation at Heart: A Culture of Agility and Growth · Can-Do Spirit: At MAQ Software, we cultivate a collaborative engineering environment where a "can-do" attitude thrives. · Experienced Leadership: Our key managers boast impressive academic backgrounds and a proven track record of growing the company and mentoring software engineers. · Embracing the Future: Our lean structure allows for agility and rapid adoption of cutting-edge technologies and computing trends, keeping us ahead of the curve. · Cutting-Edge & Fast-Paced: Be at the forefront of technology! Our globally distributed engineering team tackles challenging problems using the latest software practices and rapid development cycles. · Continuous Learning & Innovation: Our collaborative and supportive environment fosters innovation and growth. You'll constantly learn and push boundaries alongside talented colleagues. · Rewarding Career Path: We offer exciting and rewarding work experience that will propel your career forward. Examples of some of our projects: § We built an analytics platform for a leading fintech company using Azure services, enabling them to scale to 1000+ customers and provide self-service, near real-time analytics. Our solution, based on Azure Synapse, Azure Data Lake Storage, Azure Data Factory, Azure Databricks, and Power BI, followed the best practices of the Azure Well-Architected Framework, and leveraged migration strategies from Microsoft’s Cloud Adoption Framework. Our automated deployment framework reduced setup time from days to hours. The platform now offers powerful self-service analytics, enabling their customers to reach millions of customers faster and easily integrate machine learning models for innovation. § We helped an organic supermarket chain migrate to Microsoft Power BI and Azure Synapse to improve their reporting and data analytics capabilities. The previous system was time-consuming, error-prone, and offered limited visualization and self-service capabilities. With the new solution, the client can now perform direct queries between front-end Power BI reports and back-end data, enabling real-time insights and a holistic view across teams. Azure Synapse also provides higher data security. The migration resulted in automatic report generation, reduced operational costs, increased ROI, and better business decisions. § We are developing multiple products to revolutionize the Power BI experience. EmbedFAST simplifies Power BI integration into applications, eliminating the need for complex coding. LoadFAST unlocks the full potential of Power BI with our automation toolkit. Experience faster loading times and smoother user experience. CertyFAST will streamline Power BI model development with automated error detection, DAX measure formatting, and simplified documentation. Ensure top-notch quality and adherence to best practices in enterprises. § We developed a chatbot for the Arizona Department of Economic Security (DES) to improve its Program Service Evaluator (PSE) training. The chatbot used Microsoft Azure Cognitive Services to answer PSE questions based on the policy manual of various state benefit programs. The chatbot learned from user feedback and crawled the policy content automatically. 
The chatbot increased evaluation efficiency, reduced senior staff time, and provided conversational responses to PSEs. The chatbot integrated seamlessly into the PSEs’ workflow and was accessible through a web interface and Skype for Business. § Our client, a leader in the energy and utilities sector, needed a scalable solution to analyze large volumes of IoT data from diverse applications. We used Power BI and Azure Data Explorer (ADX) to ingest, transform, and visualize semi-structured JSON data from network logs. Our solution reduced the data refresh time, handled dynamic schema changes, and enabled comprehensive analysis using Direct Query mode. Our client gained real-time insights and a competitive edge in the industry with our robust and effective solution. The solution also facilitated efficient data ingestion and transformation, effectively tackled issues pertaining to data refresh and dynamic schema of data. § We helped a multinational food and beverage chain to improve its sales forecasting accuracy by building a hybrid machine learning model on Azure Databricks. The model used historical sales, weather, and event data to predict future sales and analyze the impact of various factors on sales. The model also detected and explained seasonal and daily spikes and lag periods. Our model reduced the MAPE value from 0.13 to 0.09 and enabled the client to make better business decisions. To read about some of our recent projects, visit https://maqsoftware.com/case-studies Job Responsibilities: Job Description: We are looking for a user-experienced designer with a strong interest in the design and development of engaging user experiences for business users. The ideal candidate will thrive in a work environment that requires strong problem-solving skills, independence, and self-direction. As a key member of a dynamic and fast-moving design team, your primary focus will be to create elegant and customer-focused designs that provide superior experience working in close collaboration with business, product, technology, and design teams across international locations. The candidate will also gain extensive experience in a fast-paced and innovative software development environment. Primary Responsibilities: Demonstrate consistency and precision in UI design, ensuring seamless prototyping and fast-paced execution Collaborate on user experience planning with engineers and clients Remain open to learning new skills and adapting to evolving design requirements Maintain a quick turnaround time while managing multiple work items and projects in parallel Conduct not only research and ideation but also perform hands-on design work with meticulous attention to detail Participate in design reviews, contribute thoughtful feedback and creative solutions Collaborate with peers, share creative input, and help propagate the vision for ongoing projects. Execute interaction and visual design as part of a distributed engineering team Take concepts through to production, including creating red-lines, prototypes, guidelines, and effective communication materials to realize design intent Create information architecture and task-flow diagrams, wireframes, and interactive prototypes Create interactive designs for dashboards, web apps, websites, web portals, Power BI, Power Apps, and similar platforms Design graphics elements and design assets, such as posters, flyers, banners, logos, icons, product demo videos, etc. 
Create Power BI reports with impactful data visualizations Create/edit presentations in PowerPoint Ensure compliance with company guidelines, deadlines, and design standards Assist in addressing user experience and interface design needs based on existing guidelines Create three variations of design concepts exploring different approaches and decisions Demonstrate hands-on experience with design tools, prioritizing Figma as the primary tool, along with Adobe Creative Cloud Suite Expedite design work by leveraging AI tools, including proficiency with popular AI plugins in the design industry to create initial drafts Understand and apply design frameworks such as Microsoft Fluent UI, Material Design, atomic design principles, and WCAG accessibility standards Solve diverse design challenges creatively and efficiently, with a strong grasp of interaction, usability, graphic design, and knowledge of HTML, CSS, and React-based design systems Candidate Profile : · Bachelor’s or master’s degree in interaction design, new media design or design related field · Ability to take on a design challenge and think through the solution in terms of business impact, user experience and engineering overhead · Excellent written and verbal communication skills, with both technical and not-so-technical people · Ability to explain why a design works or why it doesn’t · Great working knowledge of usability principles, best practices, and techniques · A portfolio demonstrating experience with UX/UI design · Exceptional design skills, production value and attention to detail · Strong working knowledge of Figma and Adobe Creative Cloud (including but not limited to Illustrator, Photoshop, After Effects, Premier Pro) and other associated design tools · Experience with user interface design patterns and standard UCD methodologies · Self-managed with a high degree of ownership, accountability, and work ethics and able to work in a fast-paced environment Compensation: Campus: MAQ Software, A-3, Sector 145, Noida Interview Preparation: · Review Gartner’s two modes of IT (Fast IT versus Slow IT) · Review Founder’s Mentality by Bain and Company (http://www.bain.com/publications/business-insights/founders-mentality.aspx ) · Review of What I Did Not Learn in B – School and What I Did Not Learn at IIT by Rajeev Agarwal, Founder and Managing Consultant. These books will help you learn about our company culture. The books are available from your campus student coordinator · Showcase your interest and aptitude in technology services industry Job Type: Full-time Pay: ₹600,000.00 - ₹700,000.00 per year Work Location: In person

Posted 1 week ago

Apply

5.0 years

0 Lacs

Noida

On-site

Engineering at Innovaccer With every line of code, we accelerate our customers' success, turning complex challenges into innovative solutions. Collaboratively, we transform each data point into valuable insights. Join us and be part of a team that’s turning the vision of better healthcare into reality—one line of code at a time. Together, we’re shaping the future and making a meaningful impact on the world. About the Role Technology that once promised to simplify patient care has, in many cases, created more complexity. At Innovaccer, we tackle this challenge by leveraging the vast amount of healthcare data available and replacing long-standing issues with intelligent, data-driven solutions. Data is the foundation of our innovation. We are looking for a Senior AI Engineer who understands healthcare data and can build algorithms that personalize treatments based on a patient’s clinical and behavioral history. This role will help define and build the next generation of predictive analytics tools in healthcare. A Day in the Life Design and build scalable AI platform architecture to support ML development, agentic frameworks, and robust self-serve AI pipelines. Develop agentic frameworks and a catalog of AI agents tailored for healthcare use cases. Design and deploy high-performance, low-latency AI applications. Build and optimize ML/DL models, including generative models like Transformers and GANs. Construct and manage data ingestion and transformation pipelines for scalable AI solutions. Conduct experiments, statistical analysis, and derive insights to guide development. Collaborate with data scientists, engineers, product managers, and business stakeholders to translate AI innovations into real-world applications. Partner with business leaders and clients to understand pain points and co-create scalable AI-driven solutions. Experience with Docker, Kubernetes, AWS/Azure, Snowflake, and healthcare data systems. Preferred Skills Proficient in Python for building scalable, high-performance AI applications Experience with reinforcement learning and multi-agent systems LLM optimization and deployment at scale Familiarity with healthcare data and real-world AI use cases Requirements What You Need Master’s in Computer Science, Engineering, or a related field. 5+ years of software engineering experience with strong API development skills. 3+ years of experience in data science, including at least 1+ year building generative AI pipelines, agents, and RAG systems. Strong Python programming skills, particularly in enterprise application development and optimization. Experience with: LLMs, prompt engineering, and fine-tuning SLMs Frameworks like LangChain, CrewAI, or Autogen (experience with at least one is required) Vector databases (e.g., FAISS, ChromaDB) Embedding models and Retrieval-Augmented Generation (RAG) design Familiarity with at least one ML platform (Databricks, Azure ML, SageMaker) Benefits Here’s What We Offer Generous Leave Policy: Up to 40 days of leave annually Parental Leave: One of the industry’s best parental leave policies Sabbatical Leave: Take time off for upskilling, research, or personal pursuits Health Insurance: Comprehensive coverage for you and your family Pet-Friendly Office*: Bring your furry friends to our Noida office Creche Facility for Children*: On-site care at our India offices Pet-friendly and creche facilities are available at select locations only (e.g., Noida for pets)
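The requirements above mention vector databases and Retrieval-Augmented Generation (RAG). A minimal retrieval sketch using FAISS follows; the document snippets are invented, and random vectors stand in for the embeddings a real pipeline would produce with an embedding model:

```python
# Minimal retrieval sketch for a RAG pipeline; documents and vectors are placeholders.
import numpy as np
import faiss  # pip install faiss-cpu

# Placeholder embeddings: a real pipeline would generate these with an embedding
# model (e.g. a sentence-transformer or a hosted embedding API).
dim = 384
doc_texts = [
    "Patient discharge summary mentioning type 2 diabetes.",
    "Clinical note describing hypertension follow-up.",
    "Care-gap guideline for annual HbA1c screening.",
]
doc_vectors = np.random.rand(len(doc_texts), dim).astype("float32")

index = faiss.IndexFlatL2(dim)  # exact nearest-neighbour search over the corpus
index.add(doc_vectors)

query_vector = np.random.rand(1, dim).astype("float32")
distances, ids = index.search(query_vector, 2)

retrieved = [doc_texts[i] for i in ids[0]]
# The retrieved chunks would be stitched into the LLM prompt as grounding context.
print(retrieved)
```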

Posted 1 week ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies