5.0 - 9.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Join Amgen’s Mission of Serving Patients
At Amgen, if you feel like you’re part of something bigger, it’s because you are. Our shared mission—to serve patients living with serious illnesses—drives all that we do. Since 1980, we’ve helped pioneer the world of biotech in our fight against the world’s toughest diseases. With our focus on four therapeutic areas – Oncology, Inflammation, General Medicine, and Rare Disease – we reach millions of patients each year. As a member of the Amgen team, you’ll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science-based. If you have a passion for challenges and the opportunities that lie within them, you’ll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career.

What You Will Do
As a Data Engineer, you will be responsible for designing, building, maintaining, analyzing, and interpreting data to provide actionable insights that drive business decisions. This role involves working with large datasets, developing reports, supporting and executing data initiatives, and visualizing data to ensure data is accessible, reliable, and efficiently managed. The ideal candidate has strong technical skills, experience with big data technologies, and a deep understanding of data architecture and ETL processes.

Design, develop, and maintain data solutions for data generation, collection, and processing
Be a key team member that assists in design and development of the data pipeline
Create data pipelines and ensure data quality by implementing ETL processes to migrate and deploy data across systems
Contribute to the design, development, and implementation of data pipelines, ETL/ELT processes, and data integration solutions
Develop and maintain data models, data dictionaries, and other documentation to ensure data accuracy and consistency
Implement data security and privacy measures to protect sensitive data
Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions
Collaborate and communicate effectively with product teams
Collaborate with Data Architects, Business SMEs, and Data Scientists to design and develop end-to-end data pipelines that meet fast-paced business needs across geographic regions
Identify and resolve complex data-related challenges
Adhere to best practices for coding, testing, and designing reusable code/components
Explore new tools and technologies that will help improve ETL platform performance
Participate in sprint planning meetings and provide estimations on technical implementation

What We Expect Of You
We are all different, yet we all use our unique contributions to serve patients.

Basic Qualifications:
Master's degree / Bachelor's degree and 5 to 9 years of Computer Science, IT, or related field experience.

Must-Have Skills:
Hands-on experience with big data technologies and platforms, such as Databricks, Apache Spark (PySpark, SparkSQL), and Snowflake, including workflow orchestration and performance tuning on big data processing
Proficiency in data analysis tools (e.g., SQL)
Proficient in SQL for extracting, transforming, and analyzing complex datasets from relational data stores
Experience with ETL tools such as Apache Spark, and various Python packages related to data processing and machine learning model development
Strong understanding of data modeling, data warehousing, and data integration concepts
Proven ability to optimize query performance on big data platforms

Preferred Qualifications:
Experience with software engineering best practices, including but not limited to version control, infrastructure-as-code, CI/CD, and automated testing
Knowledge of Python/R, Databricks, SageMaker, and cloud data platforms
Strong knowledge of Oracle / SQL Server, stored procedures, and PL/SQL; knowledge of the Linux OS
Knowledge of data visualization and analytics tools such as Spotfire and Power BI
Strong understanding of data governance frameworks, tools, and best practices
Knowledge of data protection regulations and compliance requirements (e.g., GDPR, CCPA)

Professional Certifications:
Databricks certification (preferred)
AWS Data Engineer/Architect certification

Soft Skills:
Excellent critical-thinking and problem-solving skills
Strong communication and collaboration skills
Demonstrated awareness of how to function in a team setting
Demonstrated presentation skills

What You Can Expect Of Us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we’ll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.

Apply now and make a lasting impact with the Amgen team. careers.amgen.com

As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
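As an illustration of the PySpark/SparkSQL ETL work this posting describes, here is a minimal sketch of an extract-transform-load step; the bucket paths and column names are hypothetical, not Amgen's.

```python
# A minimal sketch of a PySpark extract-transform-load step of the kind this
# role describes. Bucket paths and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("claims-etl").getOrCreate()

# Extract: load raw records from a landing zone
raw = spark.read.option("header", True).csv("s3://example-bucket/landing/claims/")

# Transform: standardize types, drop malformed rows, derive a flag
clean = (
    raw.withColumn("claim_amount", F.col("claim_amount").cast("double"))
    .withColumn("claim_date", F.to_date("claim_date", "yyyy-MM-dd"))
    .dropna(subset=["claim_id", "claim_date"])
    .withColumn("is_high_value", F.col("claim_amount") > 10000)
)

# Load: write partitioned Parquet for downstream analytics
clean.write.mode("overwrite").partitionBy("claim_date").parquet(
    "s3://example-bucket/curated/claims/"
)
```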
Posted 4 days ago
0 years
0 Lacs
Tiruchirappalli, Tamil Nadu, India
On-site
Company: ADRIG AI Technologies is a dynamic service-based company specializing in web development, artificial intelligence, game development, and tech talent acquisition. We deliver end-to-end solutions — from responsive websites and AI-powered tools to full-fledged game engines and skilled tech professionals for global projects. This opportunity is part of ADRIG’s newly launched Edutech vertical, FutureMinds — an initiative aimed at transforming school-level AI education through immersive, real-world experiences. FutureMinds blends cutting-edge technology with engaging communication to spark curiosity and confidence in the next generation of innovators.

Locations: Thiruvallur (Perambakkam, Polivakkam, Pallipet), Trichy (Yagapuram), Sivagangai (Kodikottai), Kanyakumari (Aralvaimozhi, Manavalakurichi)
Job Type: Part-Time | Short-Term Engagement (2 months)

Workload & Compensation:
2 to 8 hours of work per week
Each session is 2 hours long
₹1,000 per session
Immediate joiners preferred

About the Opportunity: We are inviting applications for a part-time role ideal for individuals with both technical proficiency and exceptional communication skills. Candidates should have a background in Computer Science and Engineering (B.Tech CSE) and be confident public speakers capable of clearly and engagingly articulating AI, math, and technology concepts.

Key Responsibilities:
Deliver short, structured sessions in English
Simplify and present technical topics (AI, tech, math) in an engaging way
Communicate effectively with school-age learners in a structured offline setting
Represent the organization with energy, clarity, and professionalism

Qualifications:
B.Tech in Computer Science or a related field (required)
Excellent spoken English; prior public speaking or anchoring experience preferred
Basic understanding of AI and mathematics
Strong interpersonal skills and stage presence
Experience as an RJ, emcee, communicator, educator, or content presenter is a plus

How to Apply: Interested candidates may message us directly on LinkedIn with their contact information.
Posted 4 days ago
3.0 years
0 Lacs
India
Remote
We are seeking a skilled Sr. Azure Data Engineer with hands-on experience in modern data engineering tools and platforms within the Azure ecosystem. The ideal candidate will have a strong foundation in data integration, transformation, and migration, along with a passion for working on complex data migration projects.

Job Title: Sr. Azure Data Engineer
Location: Remote work
Work Timings: 2:00 PM – 11:00 PM IST
No. of Openings: 2

Please Note: This is a pure Azure-specific role. If your expertise is primarily in AWS or GCP, we kindly request that you do not apply.

Responsibilities:
Lead the migration of large-scale SQL workloads from on-premise environments to Azure, ensuring high data integrity, minimal downtime, and performance optimization.
Design, develop, and manage end-to-end data pipelines using Azure Data Factory or Synapse Data Factory to orchestrate migration and ETL processes.
Build and administer scalable, secure Azure Data Lakes to store and manage structured and unstructured data during and post-migration.
Utilize Azure Databricks, Synapse Spark Pools, Python, and PySpark for advanced data transformation and processing.
Develop and fine-tune SQL/T-SQL scripts for data extraction, cleansing, transformation, and reporting in Azure SQL Database, SQL Managed Instances, and SQL Server.
Design and maintain ETL solutions using SQL Server Integration Services (SSIS), including reengineering SSIS packages for Azure compatibility.
Collaborate with cloud architects, DBAs, and application teams to assess existing workloads and define the best migration approach.
Continuously monitor and optimize data workflows for performance, reliability, and cost-effectiveness across Azure platforms.
Enforce best practices in data governance, security, and compliance throughout the migration lifecycle.

Required Skills and Qualifications:
3+ years of hands-on experience in data engineering, with a clear focus on SQL workload migration to Azure.
Deep expertise in: Azure Data Factory / Synapse Data Factory, Azure Data Lake, Azure Databricks / Synapse Spark Pools, Python and PySpark, SQL, and SSIS (design, development, and migration to Azure).
Proven track record of delivering complex data migration projects (on-prem to Azure, or cloud-to-cloud).
Experience re-platforming or re-engineering SSIS packages for Azure Data Factory or the Azure-SSIS Integration Runtime.
Microsoft Certified: Azure Data Engineer Associate or a similar certification preferred.
Strong problem-solving skills, attention to detail, and ability to work in fast-paced environments.
Excellent communication skills with the ability to collaborate across teams and present migration strategies to stakeholders.

If you believe you are qualified and are looking forward to setting your career on a fast track, apply by submitting a few paragraphs explaining why you believe you are the right person for this role. To know more about Techolution, visit our website: www.techolution.com

About Techolution: Techolution is a next-gen AI consulting firm on track to become one of the most admired brands in the world for "AI done right". Our purpose is to harness our expertise in novel technologies to deliver more profits for our enterprise clients while helping them deliver a better human experience for the communities they serve. At Techolution, we build custom AI solutions that produce revolutionary outcomes for enterprises worldwide. Specializing in "AI Done Right," we leverage our expertise and proprietary IP to transform operations and help achieve business goals efficiently. We are honored to have recently received the prestigious Inc 500 Best In Business award, a testament to our commitment to excellence. We were also awarded AI Solution Provider of the Year by The AI Summit 2023, Platinum sponsor at the Advantage DoD 2024 Symposium, and a lot more exciting stuff! While we are big enough to be trusted by some of the greatest brands in the world, we are small enough to care about delivering meaningful ROI-generating innovation at a guaranteed price for each client that we serve. Our thought leader, Luv Tulsidas, wrote and published a book in collaboration with Forbes, “Failing Fast? Secrets to succeed fast with AI”. Refer here for more details on the content: https://www.luvtulsidas.com/

Let's explore further! Uncover our unique AI accelerators with us:
1. Enterprise LLM Studio: Our no-code DIY AI studio for enterprises. Choose an LLM, connect it to your data, and create an expert-level agent in 20 minutes.
2. AppMod.AI: Modernizes ancient tech stacks quickly, achieving over 80% autonomy for major brands!
3. ComputerVision.AI: Offers customizable Computer Vision and Audio AI models, plus DIY tools and a Real-Time Co-Pilot for human-AI collaboration!
4. Robotics and Edge Device Fabrication: Provides comprehensive robotics, hardware fabrication, and AI-integrated edge design services.
5. RLEF AI Platform: Our proven Reinforcement Learning with Expert Feedback (RLEF) approach bridges Lab-Grade AI to Real-World AI.

Some videos you wanna watch!
Computer Vision demo at The AI Summit New York 2023
Life at Techolution
GoogleNext 2023
Ai4 - Artificial Intelligence Conferences 2023
WaWa - Solving Food Wastage
Saving lives - Brooklyn Hospital
Innovation Done Right on Google Cloud
Techolution featured on Worldwide Business with Kathy Ireland
Techolution presented by ION World’s Greatest

Visit us @ www.techolution.com to know more about our revolutionary core practices and how we enrich the human experience with technology.
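As a flavor of the SQL-workload-to-Azure migration work described in this role, here is a minimal sketch of staging a SQL Server table into ADLS Gen2 as Delta from a Databricks notebook (where `spark` and `dbutils` are provided); the server, secret scope, table, and paths are hypothetical.

```python
# A minimal sketch of one migration step: staging an on-prem SQL Server table
# into ADLS Gen2 as Delta. Assumes a Databricks notebook, where `spark` and
# `dbutils` are predefined; server, secret scope, table, and paths are
# hypothetical.
jdbc_url = "jdbc:sqlserver://onprem-host:1433;databaseName=SalesDB"

orders = (
    spark.read.format("jdbc")
    .option("url", jdbc_url)
    .option("dbtable", "dbo.Orders")
    .option("user", dbutils.secrets.get(scope="migration", key="sql-user"))
    .option("password", dbutils.secrets.get(scope="migration", key="sql-password"))
    .load()
)

# Land the table in the lake in Delta format for downstream Synapse/Databricks use
(
    orders.write.format("delta")
    .mode("overwrite")
    .save("abfss://curated@examplelake.dfs.core.windows.net/sales/orders")
)
```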
Posted 5 days ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
HCL Tech - Mega Walk-in Drive: Bulk Hiring for Freshers, 28th July & 29th July

Designation: Customer Service Associate
Any freshers can walk in for the interviews. (Arts and Science, MBA, MA, MSc, and M.Com graduates can also apply.) 2025 pass-outs can walk in; a copy of the final semester result is mandatory.
Educational Qualification: Graduate in any stream. *Engineering graduates will not be considered*
Shift: US shifts
Mode of interview: Walk-in
Date: 28th & 29th July
Timing: 11:00 AM to 2:00 PM
Contact HR: Freddy & Pradeep
Work Location: Sholinganallur
Interview Location: HCL Tech, Sholinganallur ELCOT campus, Tower 4, Chennai-119
You can also refer your friends for this role.

Perks and Benefits:
MNC cab facility (two-way, up to 30 km only)
Salary: competitive for the industry
Excellent working environment
Free cab for female employees
International trainers
World-class exposure

How You'll Grow
At HCL Tech, we offer continuous opportunities for you to find your spark and grow with us. We want you to be happy and satisfied with your role and to really learn what type of work sparks your brilliance the best. Throughout your time with us, we offer transparent communication with senior-level employees, learning and career development programs at every level, and opportunities to experiment in different roles or even pivot industries. We believe that you should be in control of your career, with unlimited opportunities to find the role that fits you best.

Why Us
We are one of the fastest-growing large tech companies in the world, with offices in 60+ countries across the globe and 222,000 employees. Our company is extremely diverse, with 165 nationalities represented. We offer the opportunity to work with colleagues across the globe. We offer a virtual-first work environment, promoting good work-life integration and real flexibility. We are invested in your growth, offering learning and career development opportunities at every level to help you find your own unique spark.
Posted 5 days ago
6.0 - 7.0 years
15 - 17 Lacs
India
On-site
About The Opportunity
This role is within the fast-paced enterprise technology and data engineering sector, delivering high-impact solutions in cloud computing, big data, and advanced analytics. We design, build, and optimize robust data platforms powering AI, BI, and digital products for leading Fortune 500 clients across industries such as finance, retail, and healthcare. As a Senior Data Engineer, you will play a key role in shaping scalable, production-grade data solutions with modern cloud and data technologies.

Role & Responsibilities
Architect and Develop Data Pipelines: Design and implement end-to-end data pipelines (ingestion → transformation → consumption) using Databricks, Spark, and cloud object storage.
Data Warehouse & Data Mart Design: Create scalable data warehouses/marts that empower self-service analytics and machine learning workloads.
Database Modeling & Optimization: Translate logical models into efficient physical schemas, ensuring optimal partitioning and performance management.
ETL/ELT Workflow Automation: Build, automate, and monitor robust data ingestion and transformation processes with best practices in reliability and observability.
Performance Tuning: Optimize Spark jobs and SQL queries through careful tuning of configurations, indexing strategies, and resource management.
Mentorship and Continuous Improvement: Provide production support, mentor team members, and champion best practices in data engineering and DevOps methodology.

Skills & Qualifications
Must-Have
6-7 years of hands-on experience building production-grade data platforms, including at least 3 years with Apache Spark/Databricks.
Expert proficiency in PySpark, Python, and advanced SQL, with a record of performance-tuning distributed jobs.
Proven expertise in data modeling, data warehouse/mart design, and managing ETL/ELT pipelines using tools like Airflow or dbt.
Hands-on experience with major cloud platforms such as AWS or Azure, and familiarity with modern lakehouse/data-lake patterns.
Strong analytical, problem-solving, and mentoring skills with a DevOps mindset and commitment to code quality.

Preferred
Experience with AWS analytics services (Redshift, Glue, S3) or the broader Hadoop ecosystem.
Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
Exposure to streaming pipelines (Kafka, Kinesis, Delta Live Tables) and real-time analytics solutions.
Familiarity with ML feature stores, MLOps workflows, or data governance frameworks.
Relevant certifications (Databricks, AWS, Azure) or active contributions to open-source projects.

Location: India | Employment Type: Full-time

Skills: agile methodologies, team leadership, performance tuning, SQL, ELT, Airflow, AWS, data modeling, Apache Spark, PySpark, data, Hadoop, Databricks, Python, dbt, big data technologies, ETL, Azure
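The Airflow orchestration called out above can be as simple as a two-task DAG; the sketch below is a minimal example with hypothetical DAG id, schedule, and callables (Airflow 2.x API).

```python
# A minimal two-task Airflow DAG (Airflow 2.x) of the kind used to orchestrate
# the ETL/ELT pipelines above. DAG id, schedule, and callables are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def ingest():
    print("pull raw files into cloud object storage")


def transform():
    print("run Spark/dbt transformations into the warehouse")


with DAG(
    dag_id="daily_sales_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow >= 2.4; use schedule_interval on older versions
    catchup=False,
) as dag:
    ingest_task = PythonOperator(task_id="ingest", python_callable=ingest)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    ingest_task >> transform_task  # transform runs only after ingest succeeds
```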
Posted 5 days ago
7.0 years
15 - 17 Lacs
India
Remote
Note: This is a remote role with occasional office visits. Candidates from Mumbai or Pune will be preferred.

About The Company
A fast-growing enterprise technology consultancy operating at the intersection of cloud computing, big-data engineering, and advanced analytics. The team builds high-throughput, real-time data platforms that power AI, BI, and digital products for Fortune 500 clients across finance, retail, and healthcare. By combining Databricks Lakehouse architecture with modern DevOps practices, they unlock insight at petabyte scale while meeting stringent security and performance SLAs.

Role & Responsibilities
Architect end-to-end data pipelines (ingestion → transformation → consumption) using Databricks, Spark, and cloud object storage.
Design scalable data warehouses/marts that enable self-service analytics and ML workloads.
Translate logical data models into physical schemas; own database design, partitioning, and lifecycle management for cost-efficient performance.
Implement, automate, and monitor ETL/ELT workflows, ensuring reliability, observability, and robust error handling.
Tune Spark jobs and SQL queries, optimizing cluster configurations and indexing strategies to achieve sub-second response times.
Provide production support and continuous improvement for existing data assets, championing best practices and mentoring peers.

Skills & Qualifications
Must-Have
6–7 years building production-grade data platforms, including 3+ years of hands-on Apache Spark/Databricks experience.
Expert proficiency in PySpark, Python, and advanced SQL, with a track record of performance-tuning distributed jobs.
Demonstrated ability to model data warehouses/marts and orchestrate ETL/ELT pipelines with tools such as Airflow or dbt.
Hands-on with at least one major cloud platform (AWS or Azure) and modern lakehouse/data-lake patterns.
Strong problem-solving skills, a DevOps mindset, and commitment to code quality; comfortable mentoring fellow engineers.

Preferred
Deep familiarity with the AWS analytics stack (Redshift, Glue, S3) or the broader Hadoop ecosystem.
Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
Experience building streaming pipelines (Kafka, Kinesis, Delta Live Tables) and real-time analytics solutions.
Exposure to ML feature stores, MLOps workflows, and data-governance/compliance frameworks.
Relevant professional certifications (Databricks, AWS, Azure) or notable open-source contributions.

Benefits & Culture Highlights
Remote-first & flexible hours with 25+ PTO days and comprehensive health cover.
Annual training budget & certification sponsorship (Databricks, AWS, Azure) to fuel continuous learning.
Inclusive, impact-focused culture where engineers shape the technical roadmap and mentor a vibrant data community.

Skills: data modeling, big data technologies, team leadership, AWS, data, SQL, agile methodologies, performance tuning, ELT, Airflow, Apache Spark, PySpark, Hadoop, Databricks, Python, dbt, ETL, Azure
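As an aside on the Spark performance-tuning responsibilities above, this minimal sketch shows two common levers: broadcasting a small dimension table and right-sizing shuffle parallelism. The paths and partition count are hypothetical.

```python
# A minimal sketch of two Spark tuning levers named above: broadcasting a small
# dimension table and right-sizing shuffle parallelism. Paths and the partition
# count are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql.functions import broadcast

spark = (
    SparkSession.builder.appName("tuning-demo")
    # Replace the default 200 shuffle partitions with a size-appropriate value
    .config("spark.sql.shuffle.partitions", "64")
    .getOrCreate()
)

facts = spark.read.parquet("s3://example-bucket/facts/transactions/")
dims = spark.read.parquet("s3://example-bucket/dims/stores/")  # small table

# Broadcasting the small side avoids shuffling the large fact table
enriched = facts.join(broadcast(dims), on="store_id", how="left")
enriched.explain()  # verify the plan chose a BroadcastHashJoin
```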
Posted 5 days ago
0 years
0 Lacs
Pune, Maharashtra, India
Remote
Unlock Your Creative Potential: Join the Revolution in Hair Perfume as a Star Intern!
Imagine being at the forefront of a groundbreaking beauty brand that's redefining how the world experiences scent and style—right from the heart of innovation! We're an exciting new hair perfume startup, fusing luxurious fragrances with cutting-edge hair care to create products that turn heads and ignite senses. If you're a visionary creative eager to skyrocket your career, this is your golden ticket to shape a brand destined for stardom. Join our vibrant, fast-paced team and transform ideas into viral sensations—your contributions could be the spark that launches us into the spotlight!

Role Highlights: Where Magic Meets Opportunity
Position: Creative Marketing & Graphic Design Intern (Your Launchpad to Beauty Industry Fame!)
Duration: 3-6 months (flexible scheduling to fit your life, part-time or full-time vibes)
Location: Mostly remote bliss (work from anywhere!), spiced up with inspiring in-person collabs at trendy Pune cafes once or twice a week—think brainstorming over lattes in the city's coolest spots
Perks & Rewards: Attractive stipend (tailored to your skills and experience) + invaluable portfolio boosters, mentorship from beauty trailblazers, and a real shot at a full-time gig in our growing empire

Your Epic Missions: Dive into Hands-On Excitement
Branding and Design: Craft a jaw-dropping logo that embodies the soul of our hair perfume revolution—make it iconic and unforgettable! Elevate product packaging to luxurious heights, blending elegance, innovation, and irresistible allure that captivates our dream audience.
Content Creation: Weave captivating stories through top-tier writing for our website, blog, and promo magic—your words will enchant and convert! Produce scroll-stopping social media posts, reels, and stories that explode on Instagram, TikTok, and Pinterest, building a loyal fanbase and viral momentum.
Digital Marketing and SEO: Fuel game-changing strategies to amplify our brand's reach, turning clicks into devoted customers. Master SEO wizardry to dominate search results, supercharge visibility, and position us as the go-to in beauty trends.

What You'll Bring: The Spark We're Craving
Enrolled in or fresh out of a program in Marketing, Graphic Design, Communications, or something equally awesome.
Wizard-level skills in tools like Adobe Creative Suite (Photoshop, Illustrator) or Canva to bring designs to life.
A storytelling genius with writing that hooks, engages, and inspires—bonus if you've got a knack for audience vibes.
Proven flair for social media sorcery, especially crafting addictive short-form videos that rack up views.
Solid grasp of digital marketing essentials and SEO superpowers (think Google Analytics and keyword mastery).
A bold, innovative mind with a burning passion for beauty—startup experience? Amazing, but your enthusiasm is the real MVP!
Stellar communication, self-drive, and the ability to thrive in an exhilarating, ever-evolving scene.

Why This Gig Will Change Your World
Gain insider access to launching a brand from zero to hero—your work will be seen by thousands!
Personalized guidance from passionate pros, plus a portfolio that'll dazzle future employers.
Ultimate flexibility to juggle studies, side hustles, or whatever fuels you.
A buzzing, collaborative vibe where your wildest ideas aren't just heard—they're celebrated and executed!

Ready to infuse your creativity into the next big thing in beauty and leave your mark on the hair perfume universe?
This isn't just an internship—it's your stage to shine! Shoot us your resume, a standout portfolio (designs, writings, or social gems), and a quick pitch on why you're our perfect match at sereluxgemora@gmail.com. Spots are filling fast—seize the moment and apply today!
Posted 5 days ago
3.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Description
Would you like to build highly available, scalable, and distributed engineering systems for one of the largest data lakes in Amazon? Does petabyte scale excite you? The Spektr Datalake team owns the central data lake for Advertising, unifying petabytes of data generated across the Ads pipeline, such as campaigns, ad-serving, billing, clicks, and impressions, into a single scalable repository. This is used across the organization to drive hundreds of thousands of complex queries for analysis, measurement, and reporting decisions for our customers. The data lake enables customers such as data engineers, business analysts, ML engineers, research scientists, economists, and data experts to collect what they need via world-class self-service tools. The Spektr Datalake team is building the next version of its data lake for 5x growth. An SDE on the ADM team has a unique opportunity to design and innovate solutions at this scale, delivering robust and scalable microservices built over Java and AWS, as well as innovating with big data technologies like Spark, EMR, Athena, and more. You will create value that materially impacts the speed and quality of decision making across the organization, resulting in tangible business growth.

Key job responsibilities
Engage with key decision makers such as Product & Program Managers to understand customer requirements and brainstorm on solutions
Design, code, and deploy components and microservices for the core job management pipeline
Ensure testability, maintainability, and a low operational footprint for your code
Participate in operational responsibilities with your team
Innovate on AWS technology to improve latency, reduce cost, and simplify operations

A day in the life
Focus on core engineering opportunities to guarantee system availability that matches our data growth
Work with a skilled team of engineers, managers, and decision makers to consistently meet customer demand
Automate monitoring of data availability, quality, and usability via simplified metrics, and drive innovations to improve guarantees for our customers
Build frugal solutions that will help make the Spektr data lake the most cost-efficient data lake in Amazon

About The Team
The mission of the Spektr Datalake team is to provide data that helps the advertising organization make informed analyses and decisions for our customers, and to determine and deploy investments for future growth via a set of central and unified products and services. Our team focuses on simplicity, usability, speed, compliance, cost efficiency, and enabling high-velocity decision making so our customers can generate high-quality insights faster. We are a global team with presence across IN and NA.

Basic Qualifications
3+ years of non-internship professional software development experience
2+ years of non-internship design or architecture (design patterns, reliability and scaling) experience with new and existing systems
Experience programming with at least one software programming language

Preferred Qualifications
3+ years of full software development life cycle experience, including coding standards, code reviews, source control management, build processes, testing, and operations
Bachelor's degree in computer science or equivalent

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.

Company - ADCI - Karnataka
Job ID: A2990168
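For a sense of the self-service query tooling this team describes, here is a minimal sketch of running an Athena query over a data lake with boto3; the database, table, and S3 locations are hypothetical.

```python
# A minimal boto3 sketch of a self-service Athena query over a data lake.
# The database, table, and S3 locations are hypothetical.
import time

import boto3

athena = boto3.client("athena", region_name="us-east-1")

start = athena.start_query_execution(
    QueryString=(
        "SELECT campaign_id, SUM(impressions) AS impressions "
        "FROM ads.daily_stats WHERE dt = '2024-01-01' GROUP BY campaign_id"
    ),
    QueryExecutionContext={"Database": "ads"},
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)
query_id = start["QueryExecutionId"]

# Poll until the query reaches a terminal state, then read the first page
while True:
    status = athena.get_query_execution(QueryExecutionId=query_id)
    state = status["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(1)

if state == "SUCCEEDED":
    rows = athena.get_query_results(QueryExecutionId=query_id)["ResultSet"]["Rows"]
    print(f"fetched {len(rows)} rows (including the header row)")
```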
Posted 5 days ago
0 years
0 Lacs
Hyderabad, Telangana, India
Remote
Company Description
At UC Brand Labs, we exist for the dreamers, the creators, and the changemakers. Whether you have a spark of an idea or a full-blown vision, we’re here to help you shape it, scale it, and bring it to life. We dedicate ourselves to fostering creativity and innovation in our clients. Join us in our mission to make dreams a reality.

Role Description
This is a full-time remote role for a Content Writing Intern. The Content Writing Intern will be responsible for creating engaging web content, developing content strategies, managing content, and assisting in communication tasks. The intern will also support writing efforts with a focus on generating high-quality content that aligns with our brand vision.

Qualifications
Strong communication skills
Web content writing and writing skills
Experience in content strategy and content management
A keen eye for detail and creativity
Ability to work independently and collaboratively
Relevant experience or coursework in English, Journalism, Communications, or a related field is a plus
Posted 5 days ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Our vision is to transform how the world uses information to enrich life for all. Micron Technology is a world leader in innovating memory and storage solutions that accelerate the transformation of information into intelligence, inspiring the world to learn, communicate, and advance faster than ever.

Responsibilities include, but are not limited to:
Strong desire to grow a career as a Data Scientist in highly automated industrial manufacturing, doing analysis and machine learning on terabytes and petabytes of diverse datasets
Experience in the areas of statistical modeling, feature extraction and analysis, and supervised/unsupervised/semi-supervised learning (exposure to the semiconductor industry is a plus but not a requirement)
Ability to extract data from different databases via SQL and other query languages, and to apply data cleansing, outlier identification, and missing-data techniques
Strong software development skills
Strong verbal and written communication skills

Experience with, or desire to learn:
Machine learning and other advanced analytical methods
Fluency in Python and/or R
PySpark and/or SparkR and/or sparklyr
Hadoop (Hive, Spark, HBase)
Teradata and/or other SQL databases
TensorFlow and/or other statistical software, including scripting capability for automating analyses
SSIS, ETL
JavaScript, AngularJS 2.0, Tableau

Experience working with time-series data, images, semi-supervised learning, and data with frequently changing distributions is a plus. Experience working with Manufacturing Execution Systems (MES) is a plus. Existing papers from CVPR, NIPS, ICML, KDD, and other key conferences are a plus, but this is not a research position.

About Micron Technology, Inc.
We are an industry leader in innovative memory and storage solutions transforming how the world uses information to enrich life for all. With a relentless focus on our customers, technology leadership, and manufacturing and operational excellence, Micron delivers a rich portfolio of high-performance DRAM, NAND, and NOR memory and storage products through our Micron® and Crucial® brands. Every day, the innovations that our people create fuel the data economy, enabling advances in artificial intelligence and 5G applications that unleash opportunities — from the data center to the intelligent edge and across the client and mobile user experience. To learn more, please visit micron.com/careers

All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran or disability status. To request assistance with the application process and/or for reasonable accommodations, please contact hrsupport_india@micron.com

Micron prohibits the use of child labor and complies with all applicable laws, rules, regulations, and other international and industry labor standards. Micron does not charge candidates any recruitment fees or unlawfully collect any other payment from candidates as consideration for their employment with Micron.

AI alert: Candidates are encouraged to use AI tools to enhance their resume and/or application materials. However, all information provided must be accurate and reflect the candidate's true skills and experiences. Misuse of AI to fabricate or misrepresent qualifications will result in immediate disqualification.

Fraud alert: Micron advises job seekers to be cautious of unsolicited job offers and to verify the authenticity of any communication claiming to be from Micron by checking the official Micron careers website.
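As a small illustration of the data cleansing, outlier identification, and missing-data techniques this posting mentions, here is a pandas sketch on a hypothetical sensor dataset; it uses a robust MAD-based rule, where the 1.4826 factor scales the median absolute deviation to a standard-deviation equivalent.

```python
# A minimal pandas sketch of missing-data handling plus robust (MAD-based)
# outlier flagging on a hypothetical sensor dataset.
import numpy as np
import pandas as pd

df = pd.DataFrame({
    "tool_id": ["A", "B", "B", "B", "B"],
    "temp_c": [21.9, 21.8, np.nan, 22.0, 95.0],  # one missing, one extreme value
})

# Fill missing readings with the per-tool median
df["temp_c"] = df.groupby("tool_id")["temp_c"].transform(
    lambda s: s.fillna(s.median())
)

# Flag readings more than 3 robust deviations from the overall median
median = df["temp_c"].median()
mad = (df["temp_c"] - median).abs().median()
df["is_outlier"] = (df["temp_c"] - median).abs() > 3 * 1.4826 * mad
print(df)  # only the 95.0 reading is flagged
```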
Posted 5 days ago
3.0 years
4 Lacs
Delhi
On-site
Job Description: Hadoop & ETL Developer Location: Shastri Park, Delhi Experience: 3+ years Education: B.E./ B.Tech/ MCA/ MSC (IT or CS) / MS Salary: Upto 80k (rest depends on interview and the experience) Notice Period: Immediate joiner to 20 days of joiners Candidates from Delhi/ NCR will only be preferred Job Summary:- We are looking for a Hadoop & ETL Developer with strong expertise in big data processing, ETL pipelines, and workflow automation. The ideal candidate will have hands-on experience in the Hadoop ecosystem, including HDFS, MapReduce, Hive, Spark, HBase, and PySpark, as well as expertise in real-time data streaming and workflow orchestration. This role requires proficiency in designing and optimizing large-scale data pipelines to support enterprise data processing needs. Key Responsibilities Design, develop, and optimize ETL pipelines leveraging Hadoop ecosystem technologies. Work extensively with HDFS, MapReduce, Hive, Sqoop, Spark, HBase, and PySpark for data processing and transformation. Implement real-time and batch data ingestion using Apache NiFi, Kafka, and Airbyte. Develop and manage workflow orchestration using Apache Airflow. Perform data integration across structured and unstructured data sources, including MongoDB and Hadoop-based storage. Optimize MapReduce and Spark jobs for performance, scalability, and efficiency. Ensure data quality, governance, and consistency across the pipeline. Collaborate with data engineering teams to build scalable and high-performance data solutions. Monitor, debug, and enhance big data workflows to improve reliability and efficiency. Required Skills & Experience : 3+ years of experience in Hadoop ecosystem (HDFS, MapReduce, Hive, Sqoop, Spark, HBase, PySpark). Strong expertise in ETL processes, data transformation, and data warehousing. Hands-on experience with Apache NiFi, Kafka, Airflow, and Airbyte. Proficiency in SQL and handling structured and unstructured data. Experience with NoSQL databases like MongoDB. Strong programming skills in Python or Scala for scripting and automation. Experience in optimizing Spark and MapReduce jobs for high-performance computing. Good understanding of data lake architectures and big data best practices. Preferred Qualifications Experience in real-time data streaming and processing. Familiarity with Docker/Kubernetes for deployment and orchestration. Strong analytical and problem-solving skills with the ability to debug and optimize data workflows. If you have a passion for big data, ETL, and large-scale data processing, we’d love to hear from you! Job Types: Full-time, Contractual / Temporary Pay: From ₹400,000.00 per year Work Location: In person
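The real-time ingestion pattern this role describes (Kafka into Hadoop via Spark) can look like the following minimal Structured Streaming sketch; the broker, topic, and HDFS paths are hypothetical, and the spark-sql-kafka package must be available on the cluster.

```python
# A minimal Structured Streaming sketch of the Kafka -> HDFS ingestion pattern
# above. Broker, topic, and paths are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("kafka-ingest").getOrCreate()

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker1:9092")
    .option("subscribe", "clickstream")
    .load()
    # Kafka delivers key/value as binary; cast to strings for downstream use
    .select(
        F.col("key").cast("string"),
        F.col("value").cast("string"),
        "timestamp",
    )
)

query = (
    events.writeStream.format("parquet")
    .option("path", "hdfs:///data/raw/clickstream/")
    .option("checkpointLocation", "hdfs:///checkpoints/clickstream/")
    .trigger(processingTime="1 minute")
    .start()
)
query.awaitTermination()
```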
Posted 5 days ago
5.0 - 9.0 years
3 - 9 Lacs
No locations specified
On-site
Join Amgen’s Mission of Serving Patients
At Amgen, if you feel like you’re part of something bigger, it’s because you are. Our shared mission—to serve patients living with serious illnesses—drives all that we do. Since 1980, we’ve helped pioneer the world of biotech in our fight against the world’s toughest diseases. With our focus on four therapeutic areas – Oncology, Inflammation, General Medicine, and Rare Disease – we reach millions of patients each year. As a member of the Amgen team, you’ll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science-based. If you have a passion for challenges and the opportunities that lie within them, you’ll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career.

Sr Associate IS Architect

What you will do
Let’s do this. Let’s change the world. In this vital role you will be responsible for designing, building, maintaining, analyzing, and interpreting data to deliver actionable insights that drive business decisions. This role involves working with large datasets, developing reports, supporting and performing data governance initiatives, and visualizing data to ensure data is accessible, reliable, and efficiently managed. The ideal candidate has deep technical skills, experience with big data technologies, and a deep understanding of data architecture and ETL processes.

Design, develop, and maintain data solutions for data generation, collection, and processing
Be a key team member that assists in design and development of the data pipeline
Stand up and enhance BI reporting capabilities through Cognos, Power BI, or similar tools
Create data pipelines and ensure data quality by implementing ETL processes to migrate and deploy data across systems
Contribute to the design, development, and implementation of data pipelines, ETL/ELT processes, and data integration solutions
Take ownership of data pipeline projects from inception to deployment; manage scope, timelines, and risks
Collaborate with multi-functional teams to understand data requirements and design solutions that meet business needs
Develop and maintain data models, data dictionaries, and other documentation to ensure data accuracy and consistency
Implement data security and privacy measures to protect sensitive data
Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions
Collaborate and communicate effectively with product teams
Collaborate with Data Architects, Business SMEs, and Data Scientists to design and develop end-to-end data pipelines that meet fast-paced business needs across geographic regions
Adhere to best practices for coding, testing, and designing reusable code/components
Explore new tools and technologies that will help to improve ETL platform performance
Participate in sprint planning meetings and provide estimations on technical implementation

What we expect of you
We are all different, yet we all use our unique contributions to serve patients.

Basic Qualifications:
Master's degree / Bachelor's degree with 5 to 9 years of experience in Computer Science, IT, or a related field

Functional Skills:
Must-Have Skills
Proficiency in Python, PySpark, and Scala for data processing and ETL (Extract, Transform, Load) workflows, with hands-on experience using Databricks for building ETL pipelines and handling big data processing
Experience with data warehousing platforms such as Amazon Redshift or Snowflake
Strong knowledge of SQL and experience with relational databases (e.g., PostgreSQL, MySQL)
Familiarity with big data frameworks like Apache Hadoop, Spark, and Kafka for handling large datasets
Experience with BI reporting tools such as Cognos, Power BI, and/or Tableau
Experience with software engineering best practices, including but not limited to version control (GitLab, Subversion, etc.), CI/CD (Jenkins, GitLab, etc.), automated unit testing, and DevOps

Good-to-Have Skills:
Experience with cloud platforms such as AWS, particularly data services (e.g., EKS, EC2, S3, EMR, RDS, Redshift/Spectrum, Lambda, Glue, Athena)
Experience with the Anaplan platform, including building, managing, and optimizing models and workflows, including scalable data integrations
Understanding of machine learning pipelines and frameworks for ML/AI models

Professional Certifications:
AWS Certified Data Engineer (preferred)
Databricks Certified (preferred)

Soft Skills:
Excellent critical-thinking and problem-solving skills
Strong communication and collaboration skills
Demonstrated awareness of how to function in a team setting
Demonstrated presentation skills

What you can expect of us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we’ll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.

Apply now and make a lasting impact with the Amgen team. careers.amgen.com

As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
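As one concrete reading of "ensure data quality by implementing ETL processes", here is a minimal PySpark sketch of a data-quality gate; the function name, thresholds, and column names are illustrative assumptions rather than Amgen's actual framework.

```python
# A minimal sketch of a PySpark data-quality gate: fail the pipeline early if
# keys are duplicated or required columns are too sparse. The function name,
# thresholds, and columns are illustrative assumptions.
from pyspark.sql import functions as F


def quality_gate(df, key_col, required_cols, max_null_ratio=0.01):
    """Raise if `key_col` has duplicates or any required column is too sparse."""
    total = df.count()
    dupes = total - df.select(key_col).distinct().count()
    if dupes > 0:
        raise ValueError(f"{dupes} duplicate values in {key_col}")
    for col in required_cols:
        nulls = df.filter(F.col(col).isNull()).count()
        if total and nulls / total > max_null_ratio:
            raise ValueError(f"{col}: null ratio {nulls / total:.2%} exceeds limit")
    return df


# Usage inside a pipeline step (df is any Spark DataFrame):
# df = quality_gate(df, "patient_id", ["visit_date", "site_code"])
```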
Posted 5 days ago
8.0 - 13.0 years
3 - 6 Lacs
No locations specified
On-site
Join Amgen’s Mission of Serving Patients
At Amgen, if you feel like you’re part of something bigger, it’s because you are. Our shared mission—to serve patients living with serious illnesses—drives all that we do. Since 1980, we’ve helped pioneer the world of biotech in our fight against the world’s toughest diseases. With our focus on four therapeutic areas – Oncology, Inflammation, General Medicine, and Rare Disease – we reach millions of patients each year. As a member of the Amgen team, you’ll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science-based. If you have a passion for challenges and the opportunities that lie within them, you’ll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career.

What you will do
Let’s do this. Let’s change the world. In this vital role you will work as a member of a Data Platform Engineering team that uses Cloud and Big Data technologies to craft, develop, implement, and maintain solutions that support various functions like Manufacturing, Commercial, Research, and Development.

Roles & Responsibilities:
Collaborate with the Lead Architect, Business SMEs, and Data Scientists to design data solutions
Serve as a Lead Engineer for technical implementation of projects, including planning, architecture, design, development, testing, and deployment, following agile methodologies
Design and develop API services for managing Databricks resources, services, and features, and support data governance applications that manage the security of data assets in line with standards
Design and develop enterprise-level reusable components, frameworks, and services that enable data engineers
Proactively work on challenging data integration problems by implementing efficient ETL patterns and frameworks for structured and unstructured data
Automate and optimize data pipelines and frameworks for an easier, more efficient development process
Overall management of the Enterprise Data Fabric/Lake on the AWS environment to ensure that service delivery is efficient and that business SLAs around uptime, performance, and capacity are met
Help define guidelines, standards, strategies, security policies, and change management policies to support the Enterprise Data Fabric/Lake
Advise and support project teams (project managers, architects, business analysts, and developers) on cloud platforms (AWS, Databricks preferred), tools, technology, and methodology related to the design and build of scalable, efficient, and maintainable Data Lake and other Big Data solutions
Experience developing in an Agile development environment and its ceremonies
Familiarity with code versioning using GitLab and with code deployment tools
Mentor junior engineers and team members

What we expect of you
Basic Qualifications
Doctorate degree / Master's degree / Bachelor's degree and 8 to 13 years in Computer Science or Engineering

Must-Have Skills:
Proven hands-on experience with cloud platforms—AWS (preferred), Azure, or GCP.
Strong development experience with Databricks, Apache Spark, PySpark, and Apache Airflow.
Proficiency in Python-based microservices development and deployment.
Experience with CI/CD pipelines, containerization (Docker, Kubernetes/EKS), and infrastructure-as-code tools.
Demonstrated ability to build enterprise-grade, performance-optimized data pipelines in Databricks using Python and PySpark, following best practices and standards.
Solid understanding of SQL and relational/dimensional data modeling techniques.
Strong analytical and problem-solving skills to address complex data engineering challenges.
Familiarity with software engineering standard methodologies, including version control, automated testing, and continuous integration.
Hands-on experience with key AWS services: EKS, EC2, S3, EMR, RDS, Redshift/Spectrum, Lambda, and Glue.
Exposure to Agile tools such as Jira or Jira Align.

Good-to-Have Skills:
Experience building APIs and services for provisioning and managing AWS Databricks environments.
Knowledge of the Databricks SDK and REST APIs for managing workspaces, clusters, jobs, users, and permissions.
Familiarity with building AI/ML solutions using Databricks-native features.
Experience working with SQL/NoSQL databases and vector databases for large language model (LLM) applications.
Exposure to model fine-tuning and prompt engineering practices.
Experience developing self-service portals using front-end frameworks like React.js.
Ability to thrive in startup-like environments with minimal direction.
Good communication skills to effectively present technical information to leadership and respond to collaborator inquiries.

Certifications (preferred but not required):
AWS Certified Data Engineer
Databricks Certification
SAFe Agile Certification

Soft Skills:
Strong analytical and problem-solving attitude with the ability to troubleshoot sophisticated data and platform issues.
Exceptional communication skills—able to translate technical concepts into clear, business-relevant language for diverse audiences.
Collaborative and globally minded, with experience working effectively in distributed, multi-functional teams.
Self-motivated and proactive, demonstrating a high degree of ownership and initiative in driving tasks to completion.
Skilled at managing multiple priorities in fast-paced environments while maintaining attention to detail and quality.
Team-oriented with a growth mindset, contributing to shared goals and fostering a culture of continuous improvement.
Effective time and task management, with the ability to estimate, plan, and deliver work across multiple projects while ensuring consistency and quality.

What you can expect of us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we’ll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.

Apply now for a career that defies imagination. Objects in your future are closer than they appear. Join us. careers.amgen.com

As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
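For the "API services for managing Databricks resources" responsibility above, here is a minimal sketch that lists workspace clusters via the public Databricks REST API; the host and token environment variables are assumptions, while `/api/2.0/clusters/list` is a documented Databricks endpoint.

```python
# A minimal sketch of managing Databricks resources over its REST API:
# list the clusters in a workspace. The host and token environment variables
# are assumptions; /api/2.0/clusters/list is a public Databricks endpoint.
import os

import requests

host = os.environ["DATABRICKS_HOST"]    # e.g. https://<workspace>.cloud.databricks.com
token = os.environ["DATABRICKS_TOKEN"]  # personal access token

resp = requests.get(
    f"{host}/api/2.0/clusters/list",
    headers={"Authorization": f"Bearer {token}"},
    timeout=30,
)
resp.raise_for_status()

for cluster in resp.json().get("clusters", []):
    print(cluster["cluster_id"], cluster["cluster_name"], cluster["state"])
```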
Posted 5 days ago
5.0 - 9.0 years
4 - 8 Lacs
No locations specified
On-site
Join Amgen’s Mission of Serving Patients
At Amgen, if you feel like you’re part of something bigger, it’s because you are. Our shared mission—to serve patients living with serious illnesses—drives all that we do. Since 1980, we’ve helped pioneer the world of biotech in our fight against the world’s toughest diseases. With our focus on four therapeutic areas – Oncology, Inflammation, General Medicine, and Rare Disease – we reach millions of patients each year. As a member of the Amgen team, you’ll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science-based. If you have a passion for challenges and the opportunities that lie within them, you’ll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career.

What you will do
As a Sr. Associate IS Security Engineer at Amgen, you will play a critical role in ensuring the security and protection of the company's information systems and data. You will implement security measures, conduct security audits, analyze security incidents, and provide recommendations for improvements. Your strong knowledge of security protocols, network infrastructure, and vulnerability assessment will contribute to maintaining a secure IT environment.

Roles & Responsibilities:
Apply patches, perform OS upgrades, and manage platform end-of-life.
Perform annual audits and periodic compliance reviews.
Support GxP validation and documentation processes.
Monitor and respond to security incidents.
Correlate alerts across platforms for threat detection.
Improve procedures through post-incident analysis.

What we expect of you
We are all different, yet we all use our unique contributions to serve patients.

Basic Qualifications:
Master's degree / Bachelor's degree and 5 to 9 years of Computer Science, IT, or related field experience.

Must-Have Skills:
Hands-on experience with big data technologies and platforms, such as Databricks, Apache Spark (PySpark, SparkSQL), and Snowflake, including workflow orchestration and performance tuning on big data processing
Solid understanding of security technologies and their core functionality
Experience in analyzing cybersecurity threats, with up-to-date knowledge of attack vectors and the cyber threat landscape
Ability to prioritize tasks effectively and solve problems efficiently in a diverse, global team environment
Good knowledge of Windows and/or Linux systems
Experience with security alert correlation across different platforms
Experience with ServiceNow, especially CMDB, the Common Service Data Model (CSDM), and IT Service Management
SQL & database knowledge: experience working with relational databases, querying data, and optimizing datasets

Preferred Qualifications:
Familiarity with cloud services like AWS (e.g., Redshift, S3, EC2, IAM) and Databricks (Delta Lake, Unity Catalog, tokens, etc.)
Understanding of Agile methodologies (Scrum, SAFe)
Knowledge of DevOps and CI/CD practices
Familiarity with scientific or healthcare data domains

Soft Skills:
Excellent critical-thinking and problem-solving skills
Strong communication and collaboration skills
Demonstrated awareness of how to function in a team setting
Demonstrated presentation skills

What you can expect of us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we’ll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.

Apply now and make a lasting impact with the Amgen team. careers.amgen.com

As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
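The cross-platform alert correlation mentioned above can be prototyped very simply; this pandas sketch groups alerts from different (hypothetical) security tools that hit the same host within a 15-minute window.

```python
# A minimal pandas sketch of cross-platform alert correlation: flag hosts where
# two or more (hypothetical) security tools alerted within the same 15-minute
# window. Field names and sources are illustrative assumptions.
import pandas as pd

alerts = pd.DataFrame({
    "source": ["edr", "firewall", "edr", "siem"],
    "host": ["srv-01", "srv-01", "srv-02", "srv-01"],
    "ts": pd.to_datetime([
        "2024-01-01 10:00", "2024-01-01 10:03",
        "2024-01-01 11:00", "2024-01-01 10:07",
    ]),
})

# Bucket alerts into 15-minute windows per host; several distinct sources in
# one bucket is a simple escalation signal.
alerts["window"] = alerts["ts"].dt.floor("15min")
correlated = (
    alerts.groupby(["host", "window"])["source"]
    .agg(["nunique", list])
    .query("nunique >= 2")
)
print(correlated)  # srv-01 in the 10:00 window has edr, firewall, and siem hits
```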
Posted 5 days ago
5.0 - 9.0 years
5 - 7 Lacs
No locations specified
On-site
Join Amgen’s Mission of Serving Patients At Amgen, if you feel like you’re part of something bigger, it’s because you are. Our shared mission—to serve patients living with serious illnesses—drives all that we do. Since 1980, we’ve helped pioneer the world of biotech in our fight against the world’s toughest diseases. With our focus on four therapeutic areas –Oncology, Inflammation, General Medicine, and Rare Disease– we reach millions of patients each year. As a member of the Amgen team, you’ll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller happier lives. Our award-winning culture is collaborative, innovative, and science based. If you have a passion for challenges and the opportunities that lay within them, you’ll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career. What you will do The Sr Associate Software Engineer is responsible for designing, developing, and maintaining software applications and solutions that meet business needs and ensuring the availability and performance of critical systems and applications. This role involves working closely with product managers, designers, data engineers, and other engineers to create high-quality, scalable software solutions and automating operations, monitoring system health, and responding to incidents to minimize downtime. Roles & Responsibilities: Possesses strong rapid prototyping skills and can quickly translate concepts into working code Contribute to both front-end and back-end development using cloud technology Develop innovative solution using generative AI technologies Ensure code quality and adherence to best practices Create and maintain documentation on software architecture, design, deployment, disaster recovery, and operations Identify and resolve technical challenges effectively Stay updated with the latest trends and advancements Work closely with product team, business team, and other stakeholders Design, develop, and implement applications and modules, including custom reports, interfaces, and enhancements Analyze and understand the functional and technical requirements of applications, solutions and systems and translate them into software architecture and design specifications Develop and execute unit tests, integration tests, and other testing strategies to ensure the quality of the software Identify and resolve software bugs and performance issues Work closely with cross-functional teams, including product management, design, and QA, to deliver high-quality software on time Customize modules to meet specific business requirements Work on integrating with other systems and platforms to ensure seamless data flow and functionality Provide ongoing support and maintenance for applications, ensuring that they operate smoothly and efficiently What we expect of you We are all different, yet we all use our unique contributions to serve patients. Basic Qualifications: Master's degree / Bachelor's degree and 5 to 9 years of Computer Science, IT or related field experience. Must-Have Skills: Proficiency in Python/PySpark development, Flask/Fast API, C#, ASP.net, PostgreSQL, Oracle, Databricks, DevOps Tools, CI/CD, Data Ingestion. Candidates should be able to write clean, efficient, and maintainable code. 
Knowledge of HTML, CSS, and JavaScript, along with popular front-end frameworks like React or Angular, is required to build interactive and responsive web applications
In-depth knowledge of data engineering concepts, ETL processes, and data architecture principles
Strong understanding of cloud computing principles, particularly within the AWS ecosystem
Strong understanding of software development methodologies, including Agile and Scrum
Experience with version control systems like Git
Hands-on experience with various cloud services, with an understanding of the pros and cons of each within well-architected cloud design principles
Strong problem-solving and analytical skills; ability to learn quickly; excellent communication and interpersonal skills
Experience with API integration, serverless, and microservices architecture
Experience with SQL/NoSQL databases and vector databases for large language models
Preferred Qualifications:
Strong understanding of cloud platforms (e.g., AWS, GCP, Azure) and containerization technologies (e.g., Docker, Kubernetes)
Experience with monitoring and logging tools (e.g., Prometheus, Grafana, Splunk)
Experience with data processing tools like Spark, or similar
Experience with SAP integration technologies
Soft Skills:
Excellent analytical and troubleshooting skills
Strong verbal and written communication skills
Ability to work effectively with global, virtual teams
High degree of initiative and self-motivation
Ability to manage multiple priorities successfully
Team-oriented, with a focus on achieving team goals
Strong presentation and public speaking skills
What you can expect of us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we’ll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.
Apply now and make a lasting impact with the Amgen team. careers.amgen.com
As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease.
Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
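To make the stack named in the Must-Have Skills concrete, here is a minimal, illustrative FastAPI sketch of the kind of data-ingestion endpoint such a role might build; the route, model, and field names are hypothetical, not Amgen's.

```python
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class IngestRecord(BaseModel):
    source: str    # originating system, e.g. "clinical-db" (illustrative)
    payload: dict  # arbitrary structured content

@app.post("/ingest")
def ingest(record: IngestRecord) -> dict:
    # A real service would validate, enrich, and hand the record to a
    # downstream pipeline (queue, Databricks job, etc.).
    return {"status": "accepted", "source": record.source}
```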
Posted 5 days ago
5.0 - 9.0 years
7 - 8 Lacs
Hyderābād
On-site
Join Amgen’s Mission of Serving Patients
At Amgen, if you feel like you’re part of something bigger, it’s because you are. Our shared mission—to serve patients living with serious illnesses—drives all that we do. Since 1980, we’ve helped pioneer the world of biotech in our fight against the world’s toughest diseases. With our focus on four therapeutic areas –Oncology, Inflammation, General Medicine, and Rare Disease– we reach millions of patients each year. As a member of the Amgen team, you’ll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science-based. If you have a passion for challenges and the opportunities that lie within them, you’ll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career.
What you will do
As a Data Engineer, you will be responsible for designing, building, maintaining, analyzing, and interpreting data to provide actionable insights that drive business decisions. This role involves working with large datasets, developing reports, supporting and executing data initiatives, and visualizing data to ensure data is accessible, reliable, and efficiently managed. The ideal candidate has strong technical skills, experience with big data technologies, and a deep understanding of data architecture and ETL processes.
Roles & Responsibilities:
Design, develop, and maintain data solutions for data generation, collection, and processing
Be a key team member that assists in design and development of the data pipeline
Create data pipelines and ensure data quality by implementing ETL processes to migrate and deploy data across systems
Contribute to the design, development, and implementation of data pipelines, ETL/ELT processes, and data integration solutions
Develop and maintain data models, data dictionaries, and other documentation to ensure data accuracy and consistency
Implement data security and privacy measures to protect sensitive data
Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions
Collaborate and communicate effectively with product teams
Collaborate with Data Architects, Business SMEs, and Data Scientists to design and develop end-to-end data pipelines to meet fast-paced business needs across geographic regions
Identify and resolve complex data-related challenges
Adhere to best practices for coding, testing, and designing reusable code/components
Explore new tools and technologies that will help to improve ETL platform performance
Participate in sprint planning meetings and provide estimations on technical implementation
What we expect of you
We are all different, yet we all use our unique contributions to serve patients.
Basic Qualifications:
Master's degree / Bachelor's degree and 5 to 9 years of Computer Science, IT or related field experience.
Must-Have Skills:
Hands-on experience with big data technologies and platforms, such as Databricks, Apache Spark (PySpark, SparkSQL), Snowflake, workflow orchestration, and performance tuning on big data processing
Proficiency in data analysis tools (e.g., SQL)
Proficient in SQL for extracting, transforming, and analyzing complex datasets from relational data stores
Experience with ETL tools such as Apache Spark, and various Python packages related to data processing and machine learning model development (a short PySpark ETL sketch follows this posting)
Strong understanding of data modeling, data warehousing, and data integration concepts
Proven ability to optimize query performance on big data platforms
Preferred Qualifications:
Experience with software engineering best practices, including but not limited to version control, infrastructure-as-code, CI/CD, and automated testing
Knowledge of Python/R, Databricks, SageMaker, and cloud data platforms
Strong knowledge of Oracle / SQL Server, stored procedures, and PL/SQL; knowledge of Linux OS
Knowledge of data visualization and analytics tools like Spotfire and Power BI
Strong understanding of data governance frameworks, tools, and best practices
Knowledge of data protection regulations and compliance requirements (e.g., GDPR, CCPA)
Professional Certifications:
Databricks Certification preferred
AWS Data Engineer/Architect
Soft Skills:
Excellent critical-thinking and problem-solving skills
Strong communication and collaboration skills
Demonstrated awareness of how to function in a team setting
Demonstrated presentation skills
What you can expect of us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we’ll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.
Apply now and make a lasting impact with the Amgen team. careers.amgen.com
As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease.
Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
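As an editorial illustration of the ETL and data-quality work this posting describes, here is a hedged PySpark sketch; the bucket paths, table, and column names are hypothetical, not Amgen's.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

# Extract: raw orders from a hypothetical landing zone.
orders = spark.read.parquet("s3://example-bucket/raw/orders/")

# Transform: de-duplicate, enforce types, stamp the load date.
cleaned = (
    orders.dropDuplicates(["order_id"])
    .withColumn("order_total", F.col("order_total").cast("double"))
    .withColumn("load_date", F.current_date())
)

# Data-quality gate: refuse to publish rows missing the business key.
missing_keys = cleaned.filter(F.col("order_id").isNull()).count()
if missing_keys > 0:
    raise ValueError(f"{missing_keys} rows missing order_id; aborting load")

# Load: publish to the curated zone.
cleaned.write.mode("overwrite").parquet("s3://example-bucket/curated/orders/")
```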
Posted 5 days ago
5.0 years
7 - 10 Lacs
Pune
On-site
Position Title - Data Engineer
About Advanced Energy
Advanced Energy Industries, Inc. (NASDAQ: AEIS) enables design breakthroughs and drives growth for leading semiconductor and industrial customers. Our precision power and control technologies, along with our applications know-how, inspire close partnerships and innovation in thin-film and industrial manufacturing. We are proud of our rich heritage and award-winning technologies, and we value the talents and contributions of all Advanced Energy's employees worldwide.
Responsibilities:
Work with the business to understand their current-state architecture and contributing data sources, technologies, interfaces, performance issues, and system configurations
Work closely with the team across the overall data warehousing program, including data acquisition, data curation, and data syndication
Apply knowledge of the Microsoft platform tooling involved in successful Azure implementations
Serve as the subject matter expert with respect to Azure, cloud applications, and system administration best practices
Design data lakes, database schemas, and data models for large-scale data platform implementations on Azure
Develop system integrations using Azure Logic Apps, Integration Services, Power Automate, PowerApps, etc.
Perform project management activities including project documentation, business requirements, and project tracking
Ensure data quality and consistency through data cleaning, transformation, and integration processes
Monitor and troubleshoot data-related issues within the Azure environment to maintain high availability and performance
Collaborate with data scientists, business analysts, and other stakeholders to understand data requirements and implement appropriate data solutions
Implement data security measures, including encryption, access controls, and auditing, to protect sensitive information
Automate data pipelines and workflows to streamline data ingestion, processing, and distribution tasks
Utilize Azure's analytics services, such as Azure Synapse Analytics, to provide insights and support data-driven decision-making
Keep abreast of the latest Azure features and technologies to enhance data engineering processes and capabilities
Document data procedures, systems, and architectures to maintain clarity and ensure compliance with regulatory standards
Provide guidance and support for data governance, including metadata management, data lineage, and data cataloguing
Tech Stack:
Python
SQL and NoSQL databases
Scala
Spark-SQL
Experience with Azure: ADLS, Databricks, Stream Analytics, SQL DW, Cosmos DB, Analysis Services, Azure Functions, Serverless Architecture, ARM Templates
Qualifications:
Bachelor's degree in Computer Science, Engineering, or a related field
Minimum of 5+ years of data and analytics field experience in industries like manufacturing, engineering, and supply chain, with large-scale implementation experience
3-5 years in a data pipeline engineering role within a Data and Analytics scope
Working knowledge of Azure Analytics features such as Stream Analytics, Machine Learning, and Application Insights
Modelling and ETL knowledge with Erwin, Azure Databricks, Data Factory, SSIS, and Azure Synapse
Develop and maintain system integrations using Azure Logic Apps, Integration Services, Power Automate, PowerApps, etc.
Microsoft Azure data platform experience with Power BI, Azure Data Lake, data warehouse, and Data Factory (an Azure ingestion sketch follows this posting)
Knowledge of migrating on-premises applications to Azure PaaS and Azure IaaS
Attention to detail and ability to coordinate multiple tasks, set priorities, and meet deadlines
Should be well versed in data structures and algorithms
Excellent analytical and problem-solving skills
Ability to work independently as a self-starter, and within a team environment
Good communication skills, written and verbal
As part of our total rewards philosophy, we believe in offering and maintaining competitive compensation and benefits programs for our employees in order to attract and retain a talented, highly engaged workforce. Our compensation programs are focused on equitable, fair pay practices, including market-based base pay and an annual pay-for-performance incentive plan, and we offer a strong benefits package in each of the countries in which we operate.
Advanced Energy is committed to diversity in its workforce including Equal Employment Opportunity for Minorities, Females, Protected Veterans, and Individuals with Disabilities.
We are committed to protecting and respecting your privacy. We take your privacy seriously and will only use your personal information to administer your application in accordance with RA No. 10173, also known as the Data Privacy Act of 2012.
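To illustrate one small step of the Azure ingestion work described above, here is a hedged sketch using the azure-storage-blob SDK; the connection string, container, and file paths are placeholders, and a real pipeline would typically hand off to Data Factory or Databricks from here.

```python
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string("<connection-string>")
container = service.get_container_client("raw-zone")  # placeholder container

# A date-partitioned prefix keeps downstream ETL loads incremental.
with open("sensor_readings.csv", "rb") as data:
    container.upload_blob(
        name="ingest/2024-01-01/sensor_readings.csv",
        data=data,
        overwrite=True,
    )
```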
Posted 5 days ago
8.0 years
5 - 10 Lacs
Bengaluru
On-site
We help the world run better
At SAP, we enable you to bring out your best. Our company culture is focused on collaboration and a shared passion to help the world run better. How? We focus every day on building the foundation for tomorrow and creating a workplace that embraces differences, values flexibility, and is aligned to our purpose-driven and future-focused work. We offer a highly collaborative, caring team environment with a strong focus on learning and development, recognition for your individual contributions, and a variety of benefit options for you to choose from.
What you'll do:
You will take ownership of designing and building core integration frameworks that enable real-time, event-driven data flows between distributed SAP systems. As a senior contributor, you will work closely with architects to drive end-to-end development of services and pipelines supporting distributed data processing, data transformations, and intelligent automation. This is a unique opportunity to contribute to SAP’s evolving data platform initiatives with hands-on involvement in Java, Python, Kafka, DevOps, real-time analytics, intelligent monitoring, BTP, and Hyperscaler ecosystems.
Responsibilities:
Design and develop microservices using Java, RESTful APIs, and messaging frameworks such as Apache Kafka (a minimal producer sketch follows this posting)
Design and develop UIs based on SAP UI5/Fiori (a plus)
Design and develop an observability framework for customer insights
Build and maintain scalable data processing and ETL pipelines that support real-time and batch data flows; experience with Databricks is an advantage
Accelerate the App2App integration roadmap by identifying reusable patterns, driving platform automation, and establishing best practices
Collaborate with cross-functional teams to enable secure, reliable, and performant communication across SAP applications
Build and maintain distributed data processing pipelines supporting large-scale data ingestion, transformation, and routing
Work closely with DevOps to define and improve CI/CD pipelines, monitoring, and deployment strategies using modern GitOps practices
Guide cloud-native, secure deployment of services on SAP BTP and major Hyperscalers (AWS, Azure, GCP)
Collaborate with SAP’s broader data platform efforts, including Datasphere, SAP Analytics Cloud, and BDC runtime architecture
Ensure adherence to best practices in microservices architecture, including service discovery, load balancing, and fault tolerance
Stay updated with the latest industry trends and technologies to continuously improve the development process
What you bring:
8+ years of hands-on experience in backend development using Java, with strong object-oriented design and integration patterns
Hands-on experience building ETL pipelines and working with large-scale data processing frameworks
Exposure to log aggregator tools like Splunk, ELK, etc.
Experience or experimentation with tools such as Databricks, Apache Spark, or other cloud-native data platforms is highly advantageous
Familiarity with SAP Business Technology Platform (BTP), SAP Datasphere, SAP Analytics Cloud, or HANA is highly desirable
Experience designing CI/CD pipelines, containerization (Docker), Kubernetes, and DevOps best practices
Working knowledge of Hyperscaler environments such as AWS, Azure, or GCP
Passionate about clean code, automated testing, performance tuning, and continuous improvement
Strong communication skills and ability to collaborate with global teams across time zones
Meet your Team
SAP is the market leader in enterprise application software, helping companies of all sizes and industries run at their best. As part of the Business Data Cloud (BDC) organization, the Foundation Services team is pivotal to SAP’s Data & AI strategy, delivering next-generation data experiences that power intelligence across the enterprise. Located in Bangalore, India, our team drives cutting-edge engineering efforts in a collaborative, inclusive, and high-impact environment, enabling innovation and integration across SAP’s data platform. #DevT3
Bring out your best
SAP innovations help more than four hundred thousand customers worldwide work together more efficiently and use business insight more effectively. Originally known for leadership in enterprise resource planning (ERP) software, SAP has evolved to become a market leader in end-to-end business application software and related services for database, analytics, intelligent technologies, and experience management. As a cloud company with two hundred million users and more than one hundred thousand employees worldwide, we are purpose-driven and future-focused, with a highly collaborative team ethic and commitment to personal development. Whether connecting global industries, people, or platforms, we help ensure every challenge gets the solution it deserves. At SAP, you can bring out your best.
We win with inclusion
SAP’s culture of inclusion, focus on health and well-being, and flexible working models help ensure that everyone – regardless of background – feels included and can run at their best. At SAP, we believe we are made stronger by the unique capabilities and qualities that each person brings to our company, and we invest in our employees to inspire confidence and help everyone realize their full potential. We ultimately believe in unleashing all talent and creating a better and more equitable world. SAP is proud to be an equal opportunity workplace and is an affirmative action employer. We are committed to the values of Equal Employment Opportunity and provide accessibility accommodations to applicants with physical and/or mental disabilities. If you are interested in applying for employment with SAP and are in need of accommodation or special assistance to navigate our website or to complete your application, please send an e-mail with your request to Recruiting Operations Team: Careers@sap.com
For SAP employees: Only permanent roles are eligible for the SAP Employee Referral Program, according to the eligibility rules set in the SAP Referral Policy. Specific conditions may apply for roles in Vocational Training.
EOE AA M/F/Vet/Disability: Qualified applicants will receive consideration for employment without regard to their age, race, religion, national origin, ethnicity, gender (including pregnancy, childbirth, et al), sexual orientation, gender identity or expression, protected veteran status, or disability.
Successful candidates might be required to undergo a background verification with an external vendor.
Requisition ID: 430165 | Work Area: Software-Design and Development | Expected Travel: 0 - 10% | Career Status: Professional | Employment Type: Regular Full Time | Additional Locations: #LI-Hybrid.
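The posting centres on Java microservices; as a language-neutral illustration of the Kafka-based, event-driven pattern it names, here is a minimal kafka-python producer sketch. The topic, broker address, and event shape are assumptions, not SAP's.

```python
import json

from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers=["localhost:9092"],  # placeholder broker
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

# An App2App-style event: one system notifying another of a change.
event = {"type": "order.updated", "order_id": "4711", "status": "shipped"}
producer.send("app2app-events", value=event)
producer.flush()  # block until the broker acknowledges the message
```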
Posted 5 days ago
4.0 - 6.0 years
3 - 4 Lacs
Bengaluru
On-site
What we offer
Our company culture is focused on helping our employees enable innovation by building breakthroughs together. How? We focus every day on building the foundation for tomorrow and creating a workplace that embraces differences, values flexibility, and is aligned to our purpose-driven and future-focused work. We offer a highly collaborative, caring team environment with a strong focus on learning and development, recognition for your individual contributions, and a variety of benefit options for you to choose from. Apply now!
What you'll do:
We are seeking a hands-on Product Manager with strong technical acumen and a passion for data engineering to drive the evolution of our data foundation capabilities. In this role, you will work closely with engineering, architecture, design, and go-to-market teams to define product requirements, shape roadmap priorities, and deliver impactful services that power the BDC platform. You will bring customer empathy, execution focus, and a collaborative mindset to ensure delivery of valuable outcomes for both internal and external stakeholders.
Product Development & Execution
Define and manage product requirements and use cases based on customer needs, stakeholder inputs, and technical feasibility
Partner with engineering teams to deliver high-quality features on time and with measurable impact
Prioritize and manage the product backlog, balancing short-term iterations with long-term strategic goals
Support the creation of clear documentation, release notes, and user-facing communication
Data-Driven Insights
Use data and user feedback to continuously improve product features and drive customer value
Collaborate with teams to monitor adoption, measure impact, and identify opportunities
Cross-Functional Collaboration
Facilitate productive working relationships across BDC, SAP LOBs, and external partners
Ensure alignment between technical teams and business stakeholders on product objectives
Customer & Stakeholder Engagement
Gather feedback directly from internal users, partners, and customers to validate hypotheses and inform future development
Participate in customer calls, demos, and workshops to showcase capabilities and understand evolving needs
What you bring:
Experience: 4–6 years of product management experience in data engineering, platform, data integration, or cloud services environments
Technical Expertise: Strong background in data engineering, including hands-on experience with ETL, data pipelines, databases, and analytics platforms. Knowledge of Apache Spark, data lake, delta lake, cloud data warehouse, and object store technologies, and experience in building APIs for data sharing using “zero copy share” techniques such as Delta and Iceberg, is highly desired (see the Delta Sharing sketch after this posting).
Customer Focus: Proven ability to translate user needs into product requirements and iterate quickly on feedback
Execution Skills: Strong organizational, collaboration, interpersonal, and planning skills with a bias toward action and delivery
Communication Skills: Strong written and verbal communication skills, with the ability to articulate complex ideas clearly and effectively to both technical and non-technical audiences
Education: Bachelor’s degree in Computer Science, Engineering, Data Science, or a related field. An advanced degree or MBA is a plus.
Meet your Team:
SAP Business Data Cloud (BDC) is SAP’s next-generation data platform that brings together data from SAP and non-SAP sources into a unified, open, and business-ready environment.
BDC enables organizations to harness the full power of their data with seamless integration, rich semantic context, and advanced governance capabilities. By providing trusted and connected data across landscapes, BDC empowers users to make better, faster, and more confident decisions. BDC Data Foundation Services is a forward-looking team at the heart of SAP’s Business Data Cloud (BDC) mission. We focus on building scalable, robust, and secure data product infrastructure services that empower customers with trusted, unified, and actionable data. As part of our growth journey, we are looking for a skilled and motivated Product Manager to join our team and contribute to the next wave of innovation in data foundations.
We are SAP
SAP innovations help more than 400,000 customers worldwide work together more efficiently and use business insight more effectively. Originally known for leadership in enterprise resource planning (ERP) software, SAP has evolved to become a market leader in end-to-end business application software and related services for database, analytics, intelligent technologies, and experience management. As a cloud company with 200 million users and more than 100,000 employees worldwide, we are purpose-driven and future-focused, with a highly collaborative team ethic and commitment to personal development. Whether connecting global industries, people, or platforms, we help ensure every challenge gets the solution it deserves. At SAP, we build breakthroughs, together.
Our inclusion promise
SAP’s culture of inclusion, focus on health and well-being, and flexible working models help ensure that everyone – regardless of background – feels included and can run at their best. At SAP, we believe we are made stronger by the unique capabilities and qualities that each person brings to our company, and we invest in our employees to inspire confidence and help everyone realize their full potential. We ultimately believe in unleashing all talent and creating a better and more equitable world. SAP is proud to be an equal opportunity workplace and is an affirmative action employer. We are committed to the values of Equal Employment Opportunity and provide accessibility accommodations to applicants with physical and/or mental disabilities. If you are interested in applying for employment with SAP and are in need of accommodation or special assistance to navigate our website or to complete your application, please send an e-mail with your request to Recruiting Operations Team: Americas: Careers.NorthAmerica@sap.com or Careers.LatinAmerica@sap.com, APJ: Careers.APJ@sap.com, EMEA: Careers@sap.com.
EOE AA M/F/Vet/Disability: Qualified applicants will receive consideration for employment without regard to their age, race, religion, national origin, ethnicity, gender (including pregnancy, childbirth, et al), sexual orientation, gender identity or expression, protected veteran status, or disability.
Successful candidates might be required to undergo a background verification with an external vendor.
Requisition ID: 430237 | Work Area: Solution and Product Management | Expected Travel: 0 - 10% | Career Status: Professional | Employment Type: Regular Full Time |
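To make the “zero copy share” idea above concrete, here is a hedged sketch using the open-source delta-sharing Python client; the profile file and share/schema/table names are placeholders that a data provider would normally issue.

```python
import delta_sharing

# A shared table is addressed as <profile-file>#<share>.<schema>.<table>;
# all names here are placeholders.
table_url = "config.share#retail_share.sales.orders"

# Reads the provider's table without copying it into our own platform.
df = delta_sharing.load_as_pandas(table_url)
print(df.head())
```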
Posted 5 days ago
3.0 years
1 - 2 Lacs
Bengaluru
On-site
JOB DESCRIPTION
Be part of a dynamic team where your distinctive skills will contribute to a winning culture and team.
As an Applied AI ML Senior Associate at JPMorgan Chase within Asset and Wealth Management, you serve as a seasoned member of an agile team to design and deliver trusted data collection, storage, access, and analytics solutions in a secure, stable, and scalable way. You are responsible for developing, testing, and maintaining critical data pipelines and architectures across multiple technical areas within various business functions in support of the firm’s business objectives.
Job responsibilities
Supports and develops an understanding of key business problems and processes
Advises on the model development process and executes tasks including data wrangling/analysis, model training, testing, and selection
Implements optimization strategies to fine-tune generative models for specific NLP use cases, ensuring high-quality outputs in summarization and text generation
Conducts evaluations of generative models (e.g., GPT-4), iterates on model architectures, and implements improvements to enhance overall performance in NLP applications
Implements monitoring mechanisms to track model performance and ensure model reliability
Communicates AI/ML/LLM/GenAI capabilities and results to both technical and non-technical audiences
Generates structured and meaningful insights from data analysis and modeling exercises and presents them in a format appropriate to the audience
Works collaboratively with other data scientists and machine learning engineers to deploy machine learning solutions
Carries out ad-hoc and periodic analysis as required by business stakeholders, the model risk function, and other groups
Adds to team culture of diversity, opportunity, inclusion, and respect
Required qualifications, capabilities, and skills
Formal training or certification in applied AI/ML concepts and 3+ years applied experience
Proficiency in programming languages like Python for model development, experimentation, and integration with the OpenAI API
Experience with machine learning frameworks, libraries, and APIs, such as TensorFlow, PyTorch, Scikit-learn, and the OpenAI API
Experience in building AI/ML models on structured and unstructured data, along with model explainability and model monitoring
Solid understanding of the fundamentals of statistics, machine learning (e.g., classification, regression, time series, deep learning, reinforcement learning), and generative model architectures, particularly GANs and VAEs
Experience with a broad range of analytical toolkits, such as SQL, Spark, Scikit-Learn, and XGBoost
Experience with graph analytics and neural networks (PyTorch)
Excellent problem-solving, communication (verbal and written), and teamwork skills
Preferred qualifications, capabilities, and skills
Expertise in building AI/ML models on structured and unstructured data along with model explainability and model monitoring
Expertise in designing and implementing pipelines using Retrieval-Augmented Generation (RAG) (a minimal sketch follows this posting)
Familiarity with the financial services industry
ABOUT US
JPMorganChase, one of the oldest financial institutions, offers innovative financial solutions to millions of consumers, small businesses and many of the world’s most prominent corporate, institutional and government clients under the J.P. Morgan and Chase brands.
Our history spans over 200 years and today we are a leader in investment banking, consumer and small business banking, commercial banking, financial transaction processing and asset management. We recognize that our people are our strength and the diverse talents they bring to our global workforce are directly linked to our success. We are an equal opportunity employer and place a high value on diversity and inclusion at our company. We do not discriminate on the basis of any protected attribute, including race, religion, color, national origin, gender, sexual orientation, gender identity, gender expression, age, marital or veteran status, pregnancy or disability, or any other basis protected under applicable law. We also make reasonable accommodations for applicants’ and employees’ religious practices and beliefs, as well as mental health or physical disability needs. Visit our FAQs for more information about requesting an accommodation.
ABOUT THE TEAM
J.P. Morgan Asset & Wealth Management delivers industry-leading investment management and private banking solutions. Asset Management provides individuals, advisors and institutions with strategies and expertise that span the full spectrum of asset classes through our global network of investment professionals. Wealth Management helps individuals, families and foundations take a more intentional approach to their wealth or finances to better define, focus and realize their goals.
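As an illustration of the Retrieval-Augmented Generation (RAG) pipelines the posting mentions, here is a minimal hedged sketch: TF-IDF retrieval stands in for a production vector store, and call_llm is a stub rather than a real OpenAI call.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

documents = [
    "Error budgets quantify how much unreliability a service may spend.",
    "GPT-4 can summarise long financial documents into key points.",
    "RAG grounds generated answers in retrieved source passages.",
]

vectorizer = TfidfVectorizer()
doc_matrix = vectorizer.fit_transform(documents)

def retrieve(query: str, k: int = 2) -> list[str]:
    # Rank documents by cosine similarity to the query and keep the top k.
    scores = cosine_similarity(vectorizer.transform([query]), doc_matrix)[0]
    return [documents[i] for i in scores.argsort()[::-1][:k]]

def call_llm(prompt: str) -> str:
    # Stand-in for a real generative-model call (e.g., the OpenAI API).
    return "stubbed completion for: " + prompt[:60]

def answer(query: str) -> str:
    # Condition the generator on retrieved context: the essence of RAG.
    context = "\n".join(retrieve(query))
    return call_llm(f"Context:\n{context}\n\nQuestion: {query}")

print(answer("What does RAG do?"))
```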
Posted 5 days ago
5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Summary
About Guidewire
Guidewire is the platform P&C insurers trust to engage, innovate, and grow efficiently. We combine digital, core, analytics, and AI to deliver our platform as a cloud service. More than 540 insurers in 40 countries, from new ventures to the largest and most complex in the world, run on Guidewire. As a partner to our customers, we continually evolve to enable their success. We are proud of our unparalleled implementation track record, with 1600+ successful projects supported by the largest R&D team and partner ecosystem in the industry. Our Marketplace provides hundreds of applications that accelerate integration, localization, and innovation.
Guidewire Software, Inc. is proud to be an equal opportunity and affirmative action employer. We are committed to an inclusive workplace, and believe that a diversity of perspectives, abilities, and cultures is a key to our success. Qualified applicants will receive consideration without regard to race, color, ancestry, religion, sex, national origin, citizenship, marital status, age, sexual orientation, gender identity, gender expression, veteran status, or disability. All offers are contingent upon passing a criminal history and other background checks where applicable to the position.
Responsibilities
Job Description
Design and Development: Design and develop robust, scalable, and efficient data pipelines. Design and manage platform solutions to support data engineering needs and ensure seamless integration and performance. Write clean, efficient, and maintainable code.
Data Management and Optimization: Ensure data quality, integrity, and security across all data pipelines. Optimize data processing workflows for performance and cost-efficiency. Develop and maintain comprehensive documentation for data pipelines and related processes.
Innovation and Continuous Improvement: Stay current with emerging technologies and industry trends in big data and cloud computing. Propose and implement innovative solutions to improve data processing and analytics capabilities. Continuously evaluate and improve existing data infrastructure and processes.
Qualifications
Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field
5+ years of experience in software engineering with a focus on data engineering and building data platforms
Strong programming experience using Python or Java
Proven experience with big data technologies like Apache Spark, Amazon EMR, Apache Iceberg, Amazon Redshift, or similar technologies
Proven experience with RDBMS (Postgres, MySQL, etc.) and NoSQL (MongoDB, DynamoDB, etc.) databases
Proficient in AWS cloud services (e.g., Lambda, S3, Athena, Glue) or comparable cloud technologies
In-depth understanding of SDLC best practices, including Agile methodologies, code reviews, and CI/CD
Experience working in event-driven and serverless architectures (see the sketch after this posting)
Experience with platform solutions and containerization technologies (e.g., Docker, Kubernetes)
Excellent problem-solving skills and the ability to work in a fast-paced, dynamic environment
Strong communication skills, both written and verbal
Why Join Us
Opportunity to work with cutting-edge technologies and innovative projects
Collaborative and inclusive work environment
Competitive salary and benefits package
Professional development opportunities and career growth
For more information, please visit www.guidewire.com and follow us on Twitter: @Guidewire_PandC.
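To illustrate the event-driven, serverless style listed in the qualifications, here is a hedged AWS Lambda sketch that reacts to an S3 object-created event and queues the object for downstream processing; the queue URL and bucket layout are placeholders, not Guidewire's.

```python
import json

import boto3

sqs = boto3.client("sqs")
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/ingest-queue"  # placeholder

def handler(event, context):
    """React to S3 object-created notifications by queueing work."""
    records = event.get("Records", [])
    for record in records:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        # Hand off rather than process inline, so the Lambda stays fast
        # and failures can be retried from the queue.
        sqs.send_message(
            QueueUrl=QUEUE_URL,
            MessageBody=json.dumps({"bucket": bucket, "key": key}),
        )
    return {"queued": len(records)}
```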
Posted 5 days ago
10.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
We're Celonis, the global leader in Process Mining technology and one of the world's fastest-growing SaaS firms. We believe there is a massive opportunity to unlock productivity by placing data and intelligence at the core of business processes - and for that, we need you to join us.
About Celonis Garage:
Celonis Garage operates as an independent research and development unit within Celonis, dedicated to pioneering new business models, exploring emerging technologies, and developing prototypes that enhance the Celonis platform. Our team drives customer co-innovations, scales breakthrough solutions, and pushes the boundaries of process intelligence.
Role Overview:
We are seeking a highly skilled and experienced Staff Engineer to join our innovative and fast-paced team at Celonis Garage. The ideal candidate will have deep expertise in software engineering, integration, data, and AI. This role involves designing and implementing cutting-edge solutions that drive the future of process intelligence.
Key Responsibilities:
Lead the design and architecture of innovative solutions and prototypes that enhance the Celonis platform and co-innovation with customers
Collaborate with cross-functional teams to drive customer co-innovations and breakthrough solutions
Solution, prototype, and develop Proof of Concepts (PoCs) while quickly iterating based on feedback
Explore and integrate emerging technologies to push the boundaries of process intelligence
Develop and maintain architectural standards and best practices
Ensure the scalability, performance, and security of solutions
Mentor and guide junior team members in architectural principles and practices
Engage with stakeholders to identify market gaps and commercial opportunities, and translate them into technical solutions
Qualifications:
Bachelor’s or Master’s degree in Computer Science, Software Engineering, or a related field
10+ years of experience as a Software Engineer, Solution Architect, Software Architect, or a similar role
Strong background in software engineering, integration, data, and AI
Expertise in designing and implementing scalable and secure architectures
Proficiency in one or more of Java, Python, JavaScript, and TypeScript
Experience with Spring Boot for backend development
Experience with React / Angular for frontend development
Hands-on experience in AI – particularly generative AI, RAG, and agents
Experience with cloud platforms (AWS, Google Cloud, Azure) in a cloud-agnostic environment
Experience in developing event-driven / streaming applications using Kafka / Spark / Flink (a minimal streaming sketch follows this posting)
Proficiency in SQL and an understanding of other database technologies such as NoSQL, Oracle, and MongoDB
Experience with Databricks for data engineering and analytics workloads
Expertise in containerization technologies and orchestration tools, including Docker and Kubernetes
Experience with CI/CD tools such as Jenkins, GitHub Actions, and Maven for deployment and distribution automation
Experience working with distributed caching technologies such as Hazelcast and Redis
Excellent problem-solving skills and the ability to think critically and creatively
Strong communication and collaboration skills
Preferred Qualifications:
Experience in Process Intelligence or related domains
Familiarity with the Celonis platform and its capabilities
Certifications in relevant technologies or architectural frameworks
What Celonis Can Offer You:
Pioneer Innovation: Work with the leading, award-winning process mining technology, shaping the future of business.
Accelerate Your Growth: Benefit from clear career paths, internal mobility, a dedicated learning program, and mentorship opportunities.
Receive Exceptional Benefits: Including generous PTO, hybrid working options, company equity (RSUs), comprehensive benefits, extensive parental leave, dedicated volunteer days, and much more. Interns and working students explore your benefits here.
Prioritize Your Well-being: Access to resources such as gym subsidies, counseling, and well-being programs.
Connect and Belong: Find community and support through dedicated inclusion and belonging programs.
Make Meaningful Impact: Be part of a company driven by strong values that guide everything we do: Live for Customer Value, The Best Team Wins, We Own It, and Earth Is Our Future.
Collaborate Globally: Join a dynamic, international team of talented individuals.
Empowered Environment: Contribute your ideas in an open culture with autonomous teams.
About Us:
Celonis makes processes work for people, companies and the planet. The Celonis Process Intelligence Platform uses industry-leading process mining and AI technology and augments it with business context to give customers a living digital twin of their business operation. It’s system-agnostic and without bias, and provides everyone with a common language for understanding and improving businesses. Celonis enables its customers to continuously realize significant value across the top, bottom, and green line. Celonis is headquartered in Munich, Germany, and New York City, USA, with more than 20 offices worldwide. Get familiar with the Celonis Process Intelligence Platform by watching this video.
Celonis Inclusion Statement:
At Celonis, we believe our people make us who we are and that “The Best Team Wins”. We know that the best teams are made up of people who bring different perspectives to the table. And when everyone feels included, able to speak up and knows their voice is heard - that's when creativity and innovation happen.
Your Privacy:
Any information you submit to Celonis as part of your application will be processed in accordance with Celonis’ Accessibility and Candidate Notices. By submitting this application, you confirm that you agree to the storing and processing of your personal data by Celonis as described in our Privacy Notice for the Application and Hiring Process.
Please be aware of common job offer scams, impersonators and frauds. Learn more here.
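As a sketch of the Kafka/Spark streaming work named in the qualifications, here is a minimal PySpark Structured Streaming job; it assumes the spark-sql-kafka package is on the classpath, and the topic and paths are placeholders.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("stream-sketch").getOrCreate()

# Read events from Kafka; broker and topic are placeholders.
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "localhost:9092")
    .option("subscribe", "process-events")
    .load()
)

# Kafka delivers the payload as bytes; decode it for downstream use.
decoded = events.select(F.col("value").cast("string").alias("raw_json"))

# Persist the decoded stream; checkpointing makes the job restartable.
query = (
    decoded.writeStream.format("parquet")
    .option("path", "/tmp/process-events/")
    .option("checkpointLocation", "/tmp/checkpoints/process-events/")
    .start()
)
query.awaitTermination()
```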
Posted 5 days ago
3.0 years
0 Lacs
Bengaluru East, Karnataka, India
On-site
The Data Engineer will serve as a technical expert in the design and development of AI data pipelines that manage both large unstructured and structured datasets, with a focus on building data pipelines for enterprise AI solutions.
Job Description
In your new role you will:
Work closely with data scientists and domain experts to design and develop AI data pipelines using an agile development process
Develop pipelines for ingesting and processing large unstructured and structured datasets from a variety of sources, with a specific emphasis on creating solutions for AI use cases to ensure efficient and effective data processing
Work efficiently with structured and unstructured data sources
Work with cloud technologies such as AWS to design and implement scalable data architectures
Support the operation of the data pipelines, which involves troubleshooting and bug fixing, as well as implementing change requests to ensure that the data pipelines continue to meet user requirements
Your Profile
You are best equipped for this task if you have:
A Master's or Bachelor's degree in Computer Science/Mathematics/Statistics or equivalent
A minimum of 3 years of relevant work experience in data engineering
Extensive hands-on experience in conceptualizing, designing, and implementing data pipelines
Proficiency in handling structured data and unstructured data formats (e.g., PPT, PDF, DOCX), databases (RDBMS, Oracle/PL SQL, MySQL, NoSQL such as Elasticsearch, MongoDB, Neo4j, Ceph), and familiarity with big data platforms (HDFS, Spark, Impala) (a short document-ingestion sketch follows this posting)
Experience in working with AWS technologies focusing on building scalable data pipelines
A strong background in software engineering and development cycles (CI/CD) with proficiency in scripting languages, particularly Python
A good understanding of and experience with the Kubernetes / OpenShift platform
Experience with front-end reporting, dashboard, and data exploration tools – Tableau
A good understanding of data management, data governance, and data security practices
High motivation, a structured and methodical approach, and a high degree of self-initiative
Team player with good cross-cultural skills to work in an international team
Customer and result orientation
#WeAreIn for driving decarbonization and digitalization. As a global leader in semiconductor solutions in power systems and IoT, Infineon enables game-changing solutions for green and efficient energy, clean and safe mobility, as well as smart and secure IoT. Together, we drive innovation and customer success, while caring for our people and empowering them to reach ambitious goals. Be a part of making life easier, safer and greener.
Are you in?
We are on a journey to create the best Infineon for everyone. This means we embrace diversity and inclusion and welcome everyone for who they are. At Infineon, we offer a working environment characterized by trust, openness, respect and tolerance and are committed to give all applicants and employees equal opportunities. We base our recruiting decisions on the applicant´s experience and skills. Please let your recruiter know if they need to pay special attention to something in order to enable your participation in the interview process. Click here for more information about Diversity & Inclusion at Infineon.
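To illustrate the unstructured-document side of this role, here is a small hedged sketch that extracts text from PDFs with pypdf so it can enter an AI data pipeline; the directory name is a placeholder.

```python
from pathlib import Path

from pypdf import PdfReader

def pdf_to_text(path: Path) -> str:
    # Concatenate per-page text; a real pipeline would also keep page
    # numbers and source metadata for traceability.
    reader = PdfReader(str(path))
    return "\n".join(page.extract_text() or "" for page in reader.pages)

for pdf in Path("raw_docs").glob("*.pdf"):  # placeholder directory
    text = pdf_to_text(pdf)
    print(f"{pdf.name}: {len(text)} characters extracted")
```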
Posted 5 days ago
7.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
About the role
Refer to responsibilities
You will be responsible for
Job Summary: Build solutions for real-world problems in workforce management for retail. You will work with a team of highly skilled developers and product managers throughout the entire software development life cycle of the products we own. In this role you will be responsible for designing, building, and maintaining our big data pipelines. Your primary focus will be on developing data pipelines using available technologies.
In this job, I’m accountable for:
Following our Business Code of Conduct and always acting with integrity and due diligence, with these specific risk responsibilities:
- Represent Talent Acquisition in all forums/seminars pertaining to process, compliance and audit
- Perform other miscellaneous duties as required by management
- Driving CI culture, implementing CI projects and innovation within the team
- Design and implement scalable and reliable data processing pipelines using Spark/Scala/Python and the Hadoop ecosystem
- Develop and maintain ETL processes to load data into our big data platform
- Optimize Spark jobs and queries to improve performance and reduce processing time
- Working with product teams to communicate and translate needs into technical requirements
- Design and develop monitoring tools and processes to ensure data quality and availability
- Collaborate with other teams to integrate data processing pipelines into larger systems
- Delivering high-quality code and solutions, bringing solutions into production
- Performing code reviews to optimise technical performance of data pipelines
- Continually look for how we can evolve and improve our technology, processes, and practices
- Leading group discussions on system design and architecture
- Manage and coach individuals, providing regular feedback and career development support aligned with business goals
- Allocate and oversee team workload effectively, ensuring timely and high-quality outputs
- Define and streamline team workflows, ensuring consistent adherence to SLAs and data governance practices
- Monitor and report key performance indicators (KPIs) to drive continuous improvement in delivery efficiency and system uptime
- Oversee resource allocation and prioritization, aligning team capacity with project and business demands
Key people and teams I work with in and outside of Tesco: TBS & Tesco Senior Management, TBS Reporting Team, Tesco UK / ROI / Central Europe, business stakeholders
People, budgets and other resources I am accountable for in my job: any other accountabilities by the business
Experience relevant for this job:
Skills: ETL, YARN, Spark, Hive, Hadoop, PySpark/Python
7+ years of experience in building and maintaining big data platforms using Spark/Scala (any one), Linux/Unix/Shell environments (any one), and query optimisation
Strong knowledge of distributed computing principles and big data technologies such as Hadoop, Spark, Streaming, etc.
Experience with ETL processes and data modelling
Problem-solving and troubleshooting skills
Working knowledge of Oozie/Airflow (a minimal Airflow sketch follows below)
Experience in writing unit test cases and shell scripting
Ability to work independently and as part of a team in a fast-paced environment
Good to have: Kafka, REST API/reporting tools
You will need
Refer to responsibilities
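As noted in the skills list above, here is a minimal Airflow orchestration sketch (assuming Airflow 2.4+); the DAG id, schedule, and task bodies are illustrative only, not Tesco's.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull data from source systems")  # placeholder extract step

def load():
    print("submit the Spark load job")      # placeholder load step

with DAG(
    dag_id="retail_pipeline_sketch",        # illustrative name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task
```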
What’s in it for you?
At Tesco, we are committed to providing the best for you. As a result, our colleagues enjoy a unique, differentiated, market-competitive reward package, based on current industry practices, for all the work they put into serving our customers, communities and planet a little better every day. Our Tesco Rewards framework consists of three pillars - Fixed Pay, Incentives, and Benefits. Total Rewards offered at Tesco is determined by four principles - simple, fair, competitive, and sustainable.
Salary - Your fixed pay is the guaranteed pay as per your contract of employment.
Performance Bonus - Opportunity to earn additional compensation bonus based on performance, paid annually.
Leave & Time-off - Colleagues are entitled to 30 days of leave (18 days of Earned Leave, 12 days of Casual/Sick Leave) and 10 national and festival holidays, as per the company’s policy.
Making Retirement Tension-Free - In addition to statutory retirement benefits, Tesco enables colleagues to participate in voluntary programmes like NPS and VPF.
Health is Wealth - Tesco promotes programmes that support a culture of health and wellness, including insurance for colleagues and their family. Our medical insurance provides coverage for dependents, including parents or in-laws.
Mental Wellbeing - We offer mental health support through self-help tools, community groups, ally networks, face-to-face counselling, and more for both colleagues and dependents.
Financial Wellbeing - Through our financial literacy partner, we offer one-to-one financial coaching at discounted rates, as well as salary advances on earned wages upon request.
Save As You Earn (SAYE) - Our SAYE programme allows colleagues to transition from being employees to Tesco shareholders through a structured 3-year savings plan.
Physical Wellbeing - Our green campus promotes physical wellbeing with facilities that include a cricket pitch, football field, badminton and volleyball courts, along with indoor games, encouraging a healthier lifestyle.
About Us
Tesco in Bengaluru is a multi-disciplinary team serving our customers, communities, and planet a little better every day across markets. Our goal is to create a sustainable competitive advantage for Tesco by standardising processes, delivering cost savings, enabling agility through technological solutions, and empowering our colleagues to do even more for our customers. With cross-functional expertise, a wide network of teams, and strong governance, we reduce complexity, thereby offering high-quality services for our customers.
Tesco in Bengaluru, established in 2004 to enable standardisation and build centralised capabilities and competencies, makes the experience better for our millions of customers worldwide and simpler for over 330,000 colleagues.
Tesco Business Solutions: Established in 2017, Tesco Business Solutions (TBS) has evolved from a single-entity traditional shared services operation in Bengaluru, India (from 2004) to a global, purpose-driven, solutions-focused organisation. TBS is committed to driving scale at speed and delivering value to the Tesco Group through the power of decision science. With over 4,400 highly skilled colleagues globally, TBS supports markets and business units across four locations in the UK, India, Hungary, and the Republic of Ireland. The organisation underpins everything that the Tesco Group does, bringing innovation, a solutions mindset, and agility to its operations and support functions, building winning partnerships across the business.
TBS's focus is on adding value and creating impactful outcomes that shape the future of the business. TBS creates a sustainable competitive advantage for the Tesco Group by becoming the partner of choice for talent, transformation, and value creation.
Posted 5 days ago
3.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Description
We have an exciting and rewarding opportunity for you to take your software engineering career to the next level.
As an Infrastructure Engineer III - AWS SRE Engineer at JPMorgan Chase within the Asset & Wealth Management Technology team, you will be a seasoned member of an agile team that designs and delivers trusted, market-leading technology products in a secure, stable, and scalable way. You are responsible for carrying out critical technology solutions across multiple technical areas within various business functions in support of the firm’s business objectives.
Job Responsibilities
Demonstrates and champions site reliability culture and practices and exerts technical influence throughout your team
Leads initiatives to improve the reliability and stability of your team’s applications and platforms, using data-driven analytics to improve service levels
Collaborates with team members to identify comprehensive service level indicators, and with stakeholders to establish reasonable service level objectives and error budgets with customers (see the error-budget sketch after this posting)
Demonstrates a high level of technical expertise within one or more technical domains and proactively identifies and solves technology-related bottlenecks in your areas of expertise
Acts as the main point of contact during major incidents for your application and demonstrates the skills to identify and solve issues quickly to avoid financial losses
Documents and shares knowledge within your organization via internal forums and communities of practice
Required Qualifications, Capabilities, And Skills
Formal training or certification on software engineering concepts and 3+ years applied experience
Experience in Spark, AWS, and Python / Java programming
Deep proficiency in reliability, scalability, performance, security, enterprise system architecture, toil reduction, and other site reliability best practices, with the ability to implement these practices within an application or platform
Fluency in at least one programming language or stack (e.g., Python, Java Spring Boot, .NET, etc.)
Deep knowledge of software applications and technical processes with emerging depth in one or more technical disciplines
Proficiency and experience in observability, such as white- and black-box monitoring, SLO alerting, and telemetry collection, using tools such as Grafana, Dynatrace, Prometheus, Datadog, Splunk, etc.
Proficiency in continuous integration and continuous delivery tools (e.g., Jenkins, GitLab, Terraform, etc.)
Experience with containers and container orchestration (e.g., ECS, Kubernetes, Docker, etc.)
Experience with troubleshooting common networking technologies and issues
Ability to identify and solve problems related to complex data structures and algorithms
Preferred Qualifications, Capabilities, And Skills
Drive to self-educate and evaluate new technology
Ability to teach new programming languages to team members
Ability to expand and collaborate across different levels and stakeholder groups
ABOUT US
JPMorganChase, one of the oldest financial institutions, offers innovative financial solutions to millions of consumers, small businesses and many of the world’s most prominent corporate, institutional and government clients under the J.P. Morgan and Chase brands. Our history spans over 200 years and today we are a leader in investment banking, consumer and small business banking, commercial banking, financial transaction processing and asset management.
We recognize that our people are our strength and the diverse talents they bring to our global workforce are directly linked to our success. We are an equal opportunity employer and place a high value on diversity and inclusion at our company. We do not discriminate on the basis of any protected attribute, including race, religion, color, national origin, gender, sexual orientation, gender identity, gender expression, age, marital or veteran status, pregnancy or disability, or any other basis protected under applicable law. We also make reasonable accommodations for applicants’ and employees’ religious practices and beliefs, as well as mental health or physical disability needs. Visit our FAQs for more information about requesting an accommodation.
About The Team
J.P. Morgan Asset & Wealth Management delivers industry-leading investment management and private banking solutions. Asset Management provides individuals, advisors and institutions with strategies and expertise that span the full spectrum of asset classes through our global network of investment professionals. Wealth Management helps individuals, families and foundations take a more intentional approach to their wealth or finances to better define, focus and realize their goals.
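Stepping back to the SLO and error-budget duties this posting describes, here is a pure-Python sketch of the underlying arithmetic; the objective and observed numbers are illustrative only.

```python
# 99.9% availability over a 30-day rolling window.
SLO = 0.999
window_minutes = 30 * 24 * 60

# The error budget is the unreliability the SLO permits.
error_budget_minutes = (1 - SLO) * window_minutes  # 43.2 minutes

bad_minutes_observed = 18  # e.g. summed from monitoring alerts

remaining = error_budget_minutes - bad_minutes_observed
burn_fraction = bad_minutes_observed / error_budget_minutes

print(f"budget: {error_budget_minutes:.1f} min, remaining: {remaining:.1f} min")
print(f"budget burned so far: {burn_fraction:.0%}")
```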
Posted 5 days ago