4.0 - 9.0 years
5 - 15 Lacs
Hyderabad, Chennai
Work from Office
Key skills: Python, SQL, PySpark, Databricks, AWS (mandatory). Added advantage: Life sciences/Pharma.
Roles and Responsibilities
1. Data Pipeline Development: Design, build, and maintain scalable data pipelines for ingesting, processing, and transforming large datasets from diverse sources into usable formats.
2. Data Integration and Transformation: Integrate data from multiple sources, ensuring data is accurately transformed and stored in optimal formats (e.g., Delta Lake, Redshift, S3).
3. Performance Optimization: Optimize data processing and storage systems for cost efficiency and high performance, including managing compute resources and cluster configurations.
4. Automation and Workflow Management: Automate data workflows using tools like Airflow, Databricks APIs, and other orchestration technologies to streamline data ingestion, processing, and reporting tasks.
5. Data Quality and Validation: Implement data quality checks, validation rules, and transformation logic to ensure the accuracy, consistency, and reliability of data.
6. Cloud Platform Management: Manage and optimize cloud infrastructure (AWS, Databricks) for data storage, processing, and compute resources, ensuring seamless data operations.
7. Migration and Upgrades: Lead migrations from legacy data systems to modern cloud-based platforms, ensuring smooth transitions and enhanced scalability.
8. Cost Optimization: Implement strategies for reducing cloud infrastructure costs, such as optimizing resource usage, setting up lifecycle policies, and automating cost alerts.
9. Data Security and Compliance: Ensure secure access to data by implementing IAM roles and policies, adhering to data security best practices, and enforcing compliance with organizational standards.
10. Collaboration and Support: Work closely with data scientists, analysts, and business teams to understand data requirements and provide support for data-related tasks.
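For illustration only, a minimal sketch of the kind of batch pipeline items 1 and 2 describe, using PySpark and Delta Lake from the key-skills list; the S3 paths, column names, and partition key are hypothetical placeholders, not part of the posting:

```python
# Minimal sketch of a batch ingest-and-transform step on Databricks.
# All bucket paths and column names below are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("batch-ingest").getOrCreate()

# Ingest raw CSV landed in S3 (source path is an assumption).
raw = (spark.read
       .option("header", "true")
       .option("inferSchema", "true")
       .csv("s3://example-raw-zone/events/"))

# Basic quality gate: drop records missing a primary key, deduplicate,
# and stamp the ingestion time.
clean = (raw.dropna(subset=["event_id"])
            .dropDuplicates(["event_id"])
            .withColumn("ingested_at", F.current_timestamp()))

# Persist as Delta, partitioned on a date column assumed to exist in the
# source, for downstream query performance.
(clean.write.format("delta")
      .mode("append")
      .partitionBy("event_date")
      .save("s3://example-curated-zone/events/"))
```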
Posted 1 month ago
4.0 - 9.0 years
0 - 1 Lacs
Ahmedabad
Work from Office
Skills & Tools:
Platforms: Oracle Primavera P6/EPPM, Microsoft Project Online, Planisware, Clarity PPM
Integration Tools: APIs (REST/SOAP), ETL tools (Informatica, Talend), Azure Data Factory
IAM/Security: Azure AD, Okta, SAML/OAuth, RBAC, SIEM tools
Data Technologies: Data Lakes (e.g., AWS S3, Azure Data Lake), SQL, Power BI/Tableau
Languages: Python, SQL, PowerShell, JavaScript (for scripting and integrations)
Role & responsibilities
Technical Consultant, EPPM Platform, Cybersecurity, and Data Integrations
Role Overview: As a Technical Consultant, you will be responsible for end-to-end setup and configuration of the Enterprise Project Portfolio Management (EPPM) platform, ensuring secure, efficient, and scalable integrations with enterprise systems including data lakes, access control tools, and project governance frameworks. You will work at the intersection of technology, security, and project operations, enabling the business to manage project portfolios effectively.
Posted 1 month ago
6.0 - 11.0 years
25 - 30 Lacs
Bengaluru
Hybrid
Mandatory Skills: Data Engineer, AWS Athena, AWS Glue, Redshift, Data Lake, Lakehouse, Python, SQL Server
Must-Have Experience:
6+ years of hands-on data engineering experience
Expertise with AWS services: S3, Redshift, EMR, Glue, Kinesis, DynamoDB
Building batch and real-time data pipelines
Python and SQL coding for data processing and analysis
Data modeling experience using cloud-based data platforms like Redshift, Snowflake, Databricks
Design and develop ETL frameworks
Nice-to-Have Experience:
ETL development using tools like Informatica, Talend, Fivetran
Creating reusable data sources and dashboards for self-service analytics
Experience using Databricks for Spark workloads or Snowflake
Working knowledge of Big Data processing
CI/CD setup
Infrastructure-as-code implementation
Any one of the AWS Professional Certifications
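As a hedged illustration of the Athena-over-data-lake portion of this stack, a small boto3 sketch that runs a query and polls for the result; the database, table, result bucket, and region are made-up placeholders:

```python
# Minimal sketch: run an Athena query against a lake table with boto3 and
# poll until it finishes. All names below are hypothetical.
import time
import boto3

athena = boto3.client("athena", region_name="ap-south-1")

qid = athena.start_query_execution(
    QueryString="SELECT order_date, COUNT(*) AS orders FROM sales GROUP BY order_date",
    QueryExecutionContext={"Database": "example_lake_db"},
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)["QueryExecutionId"]

# Poll until the query reaches a terminal state.
while True:
    state = athena.get_query_execution(QueryExecutionId=qid)["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(2)

if state == "SUCCEEDED":
    rows = athena.get_query_results(QueryExecutionId=qid)["ResultSet"]["Rows"]
    for row in rows[1:]:  # the first row is the column header
        print([col.get("VarCharValue") for col in row["Data"]])
```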
Posted 1 month ago
9.0 - 14.0 years
55 - 60 Lacs
Bengaluru
Hybrid
Dodge Position Title: Technology Lead
STG Labs Position Title:
Location: Bangalore, India
About Dodge
Dodge Construction Network exists to deliver the comprehensive data and connections the construction industry needs to build thriving communities. Our legacy is deeply rooted in empowering our customers with transformative insights, igniting their journey towards unparalleled business expansion and success. We serve decision-makers who seek reliable growth and who value relationships built on trust and quality. By combining our proprietary data with cutting-edge software, we deliver to our customers the essential intelligence needed to excel within their respective landscapes. We propel the construction industry forward by transforming data into tangible guidance, driving unparalleled advancement. Dodge is the catalyst for modern construction. https://www.construction.com/
About Symphony Technology Group (STG)
STG is a Silicon Valley (California) based private equity firm with a long and successful track record of transforming high-potential software and software-enabled services companies, as well as insights-oriented companies, into definitive market leaders. The firm brings expertise, flexibility, and resources to build strategic value and unlock the potential of innovative companies. Partnering to build customer-centric, market-winning portfolio companies, STG creates sustainable foundations for growth that bring value to all existing and future stakeholders. The firm is dedicated to transforming and building outstanding technology companies in partnership with world-class management teams. With over $5.0 billion in assets under management, including a recently raised $2.0 billion fund, STG's expansive portfolio has consisted of more than 30 global companies.
STG Labs is the incubation center for many of STG's portfolio companies, building their engineering, professional services, and support delivery teams in India. STG Labs offers an entrepreneurial start-up environment for software and AI engineers, data scientists and analysts, and project and product managers, and provides a unique opportunity to work directly for a software or technology company. Based in Bangalore, STG Labs supports hybrid working. https://stg.com
Roles and Responsibilities
Lead the design, deployment, and management of data mart and analytics infrastructure leveraging AWS services
Implement and manage CI/CD pipelines using industry-leading DevOps practices and tools
Design, implement, and oversee API architecture, ensuring robust, scalable, and secure REST API development using AWS API Gateway
Collaborate closely with data engineers, architects, and analysts to design highly performant and scalable data solutions
Mentor and guide engineering teams, fostering a culture of continuous learning and improvement
Optimize cloud resources for cost-efficiency, scalability, and reliability
Establish best practices and standards for AWS infrastructure, DevOps processes, API design, and data analytics workflows
Qualifications
Hands-on working knowledge and experience is required in:
Data structures, memory management, and basic algorithms (search, sort, etc.)
AWS data services: Redshift, Glue, EMR, Athena, Lake Formation, Lambda
Infrastructure-as-code tools: Terraform, AWS CloudFormation
Scripting languages: Python, Bash, SQL
DevOps tooling: Docker, Kubernetes, Jenkins, Bitbucket; must be comfortable in CLI/terminal environments
Command line / terminal environments
AWS security best practices
Scalable data marts, analytics systems, and RESTful APIs
Hands-on working knowledge and experience is preferred in:
Container orchestration: Kubernetes, EKS
Data visualization & warehousing: Tableau, data warehouses
Machine learning & big data pipelines
Certifications preferred: AWS certifications (Solutions Architect Professional, DevOps Engineer).
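To illustrate the REST-API-behind-API-Gateway responsibility above, a minimal sketch of a Lambda proxy-integration handler; the DynamoDB table name, key, and query parameter are hypothetical placeholders:

```python
# Minimal sketch of a Lambda handler fronted by AWS API Gateway (proxy
# integration), returning a record from an analytics store. The table name
# and key schema are hypothetical.
import json
import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("example-project-metrics")

def handler(event, context):
    # API Gateway passes query string parameters in the event payload.
    params = event.get("queryStringParameters") or {}
    project_id = params.get("project_id")
    if not project_id:
        return {"statusCode": 400,
                "body": json.dumps({"error": "project_id is required"})}

    resp = table.get_item(Key={"project_id": project_id})
    item = resp.get("Item")
    if item is None:
        return {"statusCode": 404, "body": json.dumps({"error": "not found"})}

    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps(item, default=str),
    }
```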
Posted 1 month ago
10.0 - 17.0 years
9 - 19 Lacs
Bengaluru
Remote
Azure Data Engineer
Skills required: Azure Data Engineer, Big Data, Hadoop
Develop and maintain data pipelines using Azure services like Data Factory, PySpark, Synapse, Databricks, Spark Scala, etc.
Posted 1 month ago
7.0 - 9.0 years
1 - 6 Lacs
Bengaluru
Work from Office
Designation: Data Engineer
Job Location: Bangalore
About Digit Insurance: Digit's mission is to 'Make Insurance, Simple'. We are backed by Fairfax, one of the largest global investment firms. We have also been ranked a 'LinkedIn top 5 startup' of 2018 and 2019 and are the fastest-growing insurance company. We have also been certified as a Great Place to Work! Digit has entered the unicorn club with a valuation of $1.9 billion, and while most companies take about a decade to get here, we have achieved this in just 3 years. We truly believe that this has happened as a result of the mission of the company, i.e. to make insurance simple, along with the sheer hard work and endeavours of our employees.
We are re-imagining products and redesigning processes to provide simple and transparent insurance solutions that matter to consumers. We are building a technology-driven platform that can offer customized products at reduced cost and provide great customer service. We are also the only cloud-based insure-tech company with a very focused approach towards in-house development. We are using new-age technologies like Java microservices, full stack, Angular 6, Python, React Native, DB2, machine learning, data science, and cloud-native architecture in AWS and Azure. We are headquartered in Bangalore, with offices in Pune, Trivandrum, and across India. The team is a great mix of industry veterans who know what's working and new-age technology specialists who know what could be improved.
What are we looking for? We are looking for candidates to join our Data Science team as Data Engineers.
Total experience range: 5 to 8 years of experience in SQL, Python scripting, and any cloud technologies.
Skill set:
Strong proficiency in coding, especially in Python or any other scripting language.
Working knowledge of Linux OS or shell scripting.
Hands-on experience in SQL.
Working knowledge of any cloud technologies.
Exposure to data lake concepts.
Roles and Responsibilities:
Responsible for end-to-end development of projects, which includes understanding requirements, designing the solution, implementing, testing, and maintenance.
Responsible for resolving issues that might occur in existing solutions.
Responsible for optimization of existing solutions to save time and resources.
Posted 1 month ago
3.0 - 6.0 years
7 - 9 Lacs
Jaipur, Bengaluru
Work from Office
We are seeking a skilled Data Engineer to join our team. The ideal candidate will have strong experience with Databricks, Python, SQL, Spark, PySpark, DBT, and AWS, and a solid understanding of big data ecosystems, data lake architecture, and data modeling.
Posted 1 month ago
7.0 - 12.0 years
3 - 7 Lacs
Gurugram
Work from Office
AHEAD builds platforms for digital business. By weaving together advances in cloud infrastructure, automation and analytics, and software delivery, we help enterprises deliver on the promise of digital transformation.
At AHEAD, we prioritize creating a culture of belonging, where all perspectives and voices are represented, valued, respected, and heard. We create spaces to empower everyone to speak up, make change, and drive the culture at AHEAD. We are an equal opportunity employer, and do not discriminate based on an individual's race, national origin, color, gender, gender identity, gender expression, sexual orientation, religion, age, disability, marital status, or any other protected characteristic under applicable law, whether actual or perceived. We embrace all candidates that will contribute to the diversification and enrichment of ideas and perspectives at AHEAD.
AHEAD is looking for a Sr. Data Engineer (L3 support) to work closely with our dynamic project teams (both on-site and remotely). This Data Engineer will be responsible for hands-on engineering of data platforms that support our clients' advanced analytics, data science, and other data engineering initiatives. This consultant will build and support modern data environments that reside in the public cloud or multi-cloud enterprise architectures.
The Data Engineer will have responsibility for working on a variety of data projects. This includes orchestrating pipelines using modern data engineering tools/architectures, as well as design and integration of existing transactional processing systems. The appropriate candidate must be a subject matter expert in managing data platforms.
Responsibilities:
A Sr. Data Engineer should be able to build, operationalize, and monitor data processing systems.
Create robust and automated pipelines to ingest and process structured and unstructured data from various source systems into analytical platforms, using batch and streaming mechanisms and leveraging the cloud-native toolset.
Implement custom applications using tools such as Event Hubs, ADF, and other cloud-native tools as required to address streaming use cases.
Engineer and maintain ELT processes for loading the data lake (Cloud Storage, Data Lake Gen2).
Leverage the right tools for the right job to deliver testable, maintainable, and modern data solutions.
Respond to customer/team inquiries and escalations and assist in troubleshooting and resolving challenges.
Work with other scrum team members to estimate and deliver work inside of a sprint.
Research data questions, identify root causes, and interact closely with business users and technical resources.
Should possess ownership and leadership skills to collaborate effectively with Level 1 and Level 2 teams.
Must have experience in raising tickets with Microsoft and engaging with them to address any service or tool outages in production.
Qualifications:
7+ years of professional technical experience
5+ years of hands-on data architecture and data modelling at SME level
5+ years of experience building highly scalable data solutions using Azure Data Factory, Spark, Databricks, Python
5+ years of experience working in cloud environments (AWS and/or Azure)
3+ years of programming languages such as Python, Spark, and Spark SQL
Should have strong knowledge of the architecture of ADF and Databricks.
Able to work with Level 1 and Level 2 teams to resolve platform outages in production environments.
Strong client-facing communication and facilitation skills
Strong sense of urgency, ability to set priorities and perform the job with little guidance
Excellent written and verbal interpersonal skills and the ability to build and maintain collaborative and positive working relationships at all levels
Strong interpersonal and communication skills (written and oral) required
Should be able to work in shifts
Should have knowledge of Azure DevOps processes.
Key Skills: Azure Data Factory, Azure Databricks, Python, ETL/ELT, Spark, Data Lake, Data Engineering, Event Hubs, Azure Delta, Spark Streaming
Why AHEAD: Through our daily work and internal groups like Moving Women AHEAD and RISE AHEAD, we value and benefit from diversity of people, ideas, experience, and everything in between. We fuel growth by stacking our office with top-notch technologies in a multi-million-dollar lab, by encouraging cross-department training and development, and by sponsoring certifications and credentials for continued learning.
USA Employment Benefits include: Medical, Dental, and Vision Insurance; 401(k); paid company holidays; paid time off; paid parental and caregiver leave; plus more! See https://www.aheadbenefits.com/ for additional details. The compensation range indicated in this posting reflects the On-Target Earnings (OTE) for this role, which includes a base salary and any applicable target bonus amount. This OTE range may vary based on the candidate's relevant experience, qualifications, and geographic location.
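For the Event Hubs streaming use case named in the responsibilities, a hedged sketch of one common pattern, Spark Structured Streaming over the Event Hubs Kafka-compatible endpoint; the namespace, hub, storage paths, and connection string are hypothetical placeholders, and the cluster is assumed to have the Spark-Kafka package available:

```python
# Minimal sketch of streaming ingest from Azure Event Hubs into a lake via
# the Kafka-compatible endpoint. All names below are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("eventhub-stream-ingest").getOrCreate()

BOOTSTRAP = "example-namespace.servicebus.windows.net:9093"
# In practice the connection string comes from a key vault, not source code.
EH_CONN = "Endpoint=sb://example-namespace.servicebus.windows.net/;SharedAccessKeyName=..."

JAAS = ('org.apache.kafka.common.security.plain.PlainLoginModule required '
        f'username="$ConnectionString" password="{EH_CONN}";')

stream = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", BOOTSTRAP)
          .option("subscribe", "telemetry-hub")
          .option("kafka.security.protocol", "SASL_SSL")
          .option("kafka.sasl.mechanism", "PLAIN")
          .option("kafka.sasl.jaas.config", JAAS)
          .load())

# Land the raw payloads in the data lake with checkpointing for recovery.
query = (stream.selectExpr("CAST(value AS STRING) AS payload", "timestamp")
         .writeStream
         .format("delta")
         .option("checkpointLocation",
                 "abfss://lake@exampleaccount.dfs.core.windows.net/_chk/telemetry")
         .start("abfss://lake@exampleaccount.dfs.core.windows.net/raw/telemetry"))
```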
Posted 1 month ago
5.0 - 10.0 years
20 - 30 Lacs
Hyderabad
Remote
Hiring for a TOP MNC for a Data Modeler position (long-term contract, 2+ years).
The Data Modeler designs and implements data models for Microsoft Fabric and Power BI, supporting the migration from Oracle/Informatica. This offshore role ensures optimized data structures for performance and reporting needs. The successful candidate will bring expertise in data modeling and a collaborative approach.
Responsibilities
Develop conceptual, logical, and physical data models for Microsoft Fabric and Power BI solutions.
Implement data models for relational, dimensional, and data lake environments on target platforms.
Collaborate with the Offshore Data Engineer and Onsite Data Modernization Architect to ensure model alignment.
Define and govern data modeling standards, tools, and best practices.
Optimize data structures for query performance and scalability.
Provide updates on modeling progress and dependencies to the Offshore Project Manager.
Skills
Bachelor's or master's degree in computer science, data science, or a related field.
5+ years of data modeling experience with relational and NoSQL platforms.
Proficiency with modeling tools (e.g., Erwin, ER/Studio) and SQL.
Experience with Microsoft Fabric, data lakes, and BI data structures.
Strong analytical and communication skills for team collaboration.
Attention to detail with a focus on performance and consistency.
Strong management, communication, and presentation skills.
Posted 1 month ago
3.0 - 5.0 years
15 - 27 Lacs
Bengaluru
Work from Office
Job Summary
The NetApp Keystone team is responsible for cutting-edge technologies that enable NetApp's pay-as-you-go offering. Keystone helps customers manage data on-prem or in the cloud, with invoices charged in a subscription manner. As an engineer in NetApp's Keystone organization, you will be executing our most challenging and complex projects. You will be responsible for decomposing complex product requirements into simple solutions, understanding system interdependencies and limitations, and applying engineering best practices.
Job Requirements
• Strong knowledge of the Python programming language: its paradigms, constructs, and idioms
• Bachelor's/master's degree in computer science, information technology, or engineering
• Knowledge of various Python frameworks and tools
• 2+ years of experience working with the Python programming language
• Strong written and communication skills with proven fluency in English
• Proficiency in writing code for both backend and frontend
• Familiarity with database technologies such as NoSQL databases, Prometheus, and data lakes
• Hands-on experience with version control tools like Git
• Passion for learning new tools, languages, philosophies, and workflows
• Experience working with generated code and code generation techniques
• Knowledge of software development methodologies: SCRUM/Agile/Lean
• Knowledge of software deployment: Docker/Kubernetes
• Knowledge of software team tools: Git/JIRA/CI-CD
Education
Minimum of 2 to 4 years of experience required, with a B.Tech or M.Tech background.
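Since the requirements pair Python with Prometheus, here is a minimal hedged sketch of instrumenting a Python backend with the prometheus_client library; the metric names and the invoice-processing stand-in are illustrative assumptions, not Keystone's actual code:

```python
# Minimal sketch of exposing Prometheus metrics from a Python service.
# Metric names and the simulated workload are hypothetical.
import random
import time

from prometheus_client import Counter, Histogram, start_http_server

INVOICES = Counter("invoices_processed_total", "Invoices processed")
LATENCY = Histogram("invoice_processing_seconds", "Time spent per invoice")

@LATENCY.time()  # records the duration of each call into the histogram
def process_invoice():
    time.sleep(random.uniform(0.01, 0.1))  # stand-in for real work
    INVOICES.inc()

if __name__ == "__main__":
    start_http_server(8000)  # serves /metrics for Prometheus to scrape
    while True:
        process_invoice()
```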
Posted 1 month ago
5.0 - 8.0 years
25 - 35 Lacs
Gurugram, Bengaluru
Hybrid
Role & responsibilities
Work with data product managers, analysts, and data scientists to architect, build, and maintain data processing pipelines in SQL or Python.
Build and maintain a data warehouse / data lakehouse for analytics, reporting, and ML predictions.
Implement DataOps and related DevOps practices focused on creating ETL pipelines for data analytics/reporting, and ELT pipelines for model training.
Support, optimise, and transition our current processes to ensure well-architected implementations and best practices.
Work in an agile environment within a collaborative agile product team using Kanban.
Collaborate across departments and work closely with data science teams and with business (economists/data) analysts in refining their data requirements for various initiatives and data consumption requirements.
Educate and train colleagues such as data scientists, analysts, and stakeholders in data pipelining and preparation techniques, which make it easier for them to integrate and consume the data they need for their own use cases.
Participate in ensuring compliance and governance during data use, so that data users and consumers use the data provisioned to them responsibly, through data governance and compliance initiatives.
Become a data and analytics evangelist: promote the available data and analytics capabilities and expertise to business unit leaders, and educate them in leveraging these.
Preferred candidate profile
What you'll need to be successful:
8+ years of professional experience with data processing environments used in large-scale digital applications.
Extensive experience with programming in Python, Spark (Spark SQL), and SQL.
Experience with warehouse technologies such as Snowflake, and data modelling, lineage, and data governance tools such as Alation.
Professional experience of designing, building, and managing bespoke data pipelines (including ETL, ELT, and lambda architectures), using technologies such as Apache Airflow, Snowflake, Amazon Athena, AWS Glue, Amazon EMR, or other equivalents.
Strong, fundamental technical expertise in cloud-native technologies, such as serverless functions, API gateways, relational and NoSQL databases, and caching.
Experience in leading/mentoring data engineering teams.
Experience in working in teams with data scientists and ML engineers, building automated pipelines for data pre-processing and feature extraction.
An advanced degree in software/data engineering, computer/information science, or a related quantitative field, or equivalent work experience.
Strong verbal and written communication skills and ability to work well with a wide range of stakeholders.
Strong ownership, scrappy and biased for action.
Perks and benefits
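To illustrate the Airflow-orchestrated ETL this listing centres on, a minimal hedged DAG sketch; the DAG id, schedule, and placeholder task body are assumptions for illustration only:

```python
# Minimal sketch of an Airflow DAG orchestrating a daily ETL step.
# The task logic is a placeholder; names are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_and_load(**context):
    # Placeholder: pull from a source API / S3 and stage into the warehouse.
    print(f"Running load for {context['ds']}")

with DAG(
    dag_id="daily_reporting_etl",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    load = PythonOperator(
        task_id="extract_and_load",
        python_callable=extract_and_load,
    )
```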
Posted 1 month ago
5.0 - 7.0 years
15 - 25 Lacs
Chennai
Work from Office
Job Summary: We are seeking a skilled Big Data Tester & Developer to design, develop, and validate data pipelines and applications on large-scale data platforms. You will work on data ingestion, transformation, and testing workflows using tools from the Hadoop ecosystem and modern data engineering stacks. Experience: 6-12 years
Key Responsibilities:
• Develop and test Big Data pipelines using Spark, Hive, Hadoop, and Kafka
• Write and optimize PySpark/Scala code for data processing
• Design test cases for data validation, quality, and integrity
• Automate testing using Python/Java and tools like Apache NiFi, Airflow, or DBT
• Collaborate with data engineers, analysts, and QA teams
Key Skills:
• Strong hands-on experience in Big Data tools: Spark, Hive, HDFS, Kafka
• Proficient in PySpark, Scala, or Java
• Experience in data testing, ETL validation, and data quality checks
• Familiarity with SQL, NoSQL, and data lakes
• Knowledge of CI/CD, Git, and automation frameworks
We are also looking for a skilled PostgreSQL Developer/DBA to design, implement, optimize, and maintain our PostgreSQL database systems. You will work closely with developers and data teams to ensure high performance, scalability, and data integrity. Experience: 6 to 12 years
Key Responsibilities:
• Develop complex SQL queries, stored procedures, and functions
• Optimize query performance and database indexing
• Manage backups, replication, and security
• Monitor and tune database performance
• Support schema design and data migrations
Key Skills:
• Strong hands-on experience with PostgreSQL
• Proficient in SQL and PL/pgSQL scripting
• Experience in performance tuning, query optimization, and indexing
• Familiarity with logical replication, partitioning, and extensions
• Exposure to tools like pgAdmin, psql, or PgBouncer
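For the PostgreSQL tuning half of this listing, a minimal hedged sketch of inspecting a query plan from Python with psycopg2; the connection details, table, and columns are hypothetical placeholders:

```python
# Minimal sketch of the query-tuning workflow: fetch an EXPLAIN ANALYZE
# plan via psycopg2 and scan it for expensive operations. All connection
# details and table/column names are hypothetical.
import psycopg2

conn = psycopg2.connect(host="localhost", dbname="appdb",
                        user="app", password="secret")
try:
    with conn.cursor() as cur:
        cur.execute(
            """
            EXPLAIN (ANALYZE, BUFFERS)
            SELECT customer_id, SUM(amount)
            FROM orders
            WHERE order_date >= %s
            GROUP BY customer_id
            """,
            ("2024-01-01",),
        )
        for (line,) in cur.fetchall():
            print(line)  # look for Seq Scans that an index could remove
finally:
    conn.close()
```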
Posted 1 month ago
5.0 - 10.0 years
10 - 15 Lacs
Pune, Bengaluru, Mumbai (All Areas)
Hybrid
Designation: Azure Data Engineer
Experience: 5+ years
Location: Chennai, Bangalore, Pune, Mumbai
Notice Period: Immediate joiners / serving notice period
Shift Timing: 3:30 PM IST to 12:30 AM IST
Job Description:
Azure Data Engineer, must have: Azure Databricks, Azure Data Factory, Spark SQL, with analytical knowledge
6-7 years of development experience in data engineering skills
Strong experience in Spark
Understand complex data systems by working closely with engineering and product teams
Develop scalable and maintainable applications to extract, transform, and load data in various formats to SQL Server, Hadoop Data Lake, or other data storage locations.
Sincerely,
Sonia
HR Recruiter, Talent Sketchers
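As a hedged illustration of working with Azure Data Factory programmatically, a sketch of triggering and checking a pipeline run with the Azure SDK for Python; the subscription id, resource group, factory, and pipeline names are hypothetical placeholders, and a configured Azure identity is assumed:

```python
# Minimal sketch of triggering and monitoring an ADF pipeline run from
# Python. All resource names below are hypothetical.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

SUBSCRIPTION = "00000000-0000-0000-0000-000000000000"  # placeholder
client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION)

# Kick off a parameterized pipeline run.
run = client.pipelines.create_run(
    resource_group_name="rg-data-platform",
    factory_name="adf-example",
    pipeline_name="pl_daily_load",
    parameters={"load_date": "2024-01-01"},
)

# Check its status (Queued / InProgress / Succeeded / Failed).
status = client.pipeline_runs.get("rg-data-platform", "adf-example", run.run_id)
print(status.status)
```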
Posted 1 month ago
5.0 - 9.0 years
20 - 30 Lacs
Pune
Hybrid
Job Summary: We are looking for a highly skilled AWS Data Engineer with over 5 years of experience in designing, developing, and maintaining scalable data pipelines on AWS. The ideal candidate will be proficient in data engineering best practices and cloud-native technologies, with hands-on experience in building ETL/ELT pipelines, working with large datasets, and optimizing data architecture for analytics and business intelligence.
Key Responsibilities:
Design, build, and maintain scalable and robust data pipelines and ETL processes using AWS services (e.g., Glue, Lambda, EMR, Redshift, S3, Athena).
Collaborate with data analysts, data scientists, and stakeholders to understand data requirements and deliver high-quality solutions.
Implement data lake and data warehouse architectures, ensuring data governance, data quality, and compliance.
Optimize data pipelines for performance, reliability, scalability, and cost.
Automate data ingestion and transformation workflows using Python, PySpark, or Scala.
Manage and monitor data infrastructure including logging, error handling, alerting, and performance metrics.
Leverage infrastructure-as-code tools like Terraform or AWS CloudFormation for infrastructure deployment.
Ensure security best practices are implemented for data access and storage (IAM, KMS, encryption, etc.).
Document data processes, architectures, and standards.
Required Qualifications:
Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
Minimum 5 years of experience as a Data Engineer with a focus on AWS cloud services.
Strong experience in building ETL/ELT pipelines using AWS Glue, EMR, Lambda, and Step Functions.
Proficiency in SQL, Python, PySpark, and data modeling techniques.
Experience working with data lakes (S3) and data warehouses (Redshift, Snowflake, etc.).
Experience with Athena, Kinesis, Kafka, or similar streaming data tools is a plus.
Familiarity with DevOps and CI/CD processes, using tools like Git, Jenkins, or GitHub Actions.
Understanding of data privacy, governance, and compliance standards such as GDPR, HIPAA, etc.
Strong problem-solving and analytical skills, with the ability to work in a fast-paced environment.
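For the Glue-based ETL named above, a minimal hedged skeleton of a Glue PySpark job; the catalog database, table, field names, and output path are hypothetical placeholders:

```python
# Minimal skeleton of an AWS Glue PySpark job: read a catalogued table as a
# DynamicFrame, apply a light transform, and write Parquet to the lake.
# All names below are hypothetical.
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read a source table registered in the Glue Data Catalog.
src = glue_context.create_dynamic_frame.from_catalog(
    database="example_db", table_name="raw_orders")

# Light transform: keep a few columns and rename one.
slim = (src.select_fields(["order_id", "amount", "order_ts"])
           .rename_field("order_ts", "order_timestamp"))

# Write the curated output back to S3 as Parquet.
glue_context.write_dynamic_frame.from_options(
    frame=slim,
    connection_type="s3",
    connection_options={"path": "s3://example-curated/orders/"},
    format="parquet",
)
job.commit()
```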
Posted 1 month ago
9.0 - 12.0 years
25 - 40 Lacs
Hyderabad
Work from Office
Job Description: GCP Cloud Architect
Opportunity: We are seeking a highly skilled and experienced GCP Cloud Architect to join our dynamic technology team. You will play a crucial role in designing, implementing, and managing our Google Cloud Platform (GCP) infrastructure, with a primary focus on building a robust and scalable data lake in BigQuery. You will be instrumental in ensuring the reliability, security, and performance of our cloud environment, supporting critical healthcare data initiatives. This role requires strong technical expertise in GCP, excellent problem-solving abilities, and a passion for leveraging cloud technologies to drive impactful solutions within the healthcare domain.
Responsibilities:
Cloud Architecture & Design:
Design and architect scalable, secure, and cost-effective GCP solutions, with a strong emphasis on BigQuery for our data lake.
Define and implement best practices for GCP infrastructure management, security, networking, and data governance.
Develop and maintain comprehensive architectural diagrams, documentation, and standards.
Collaborate with data engineers, data scientists, and application development teams to understand their requirements and translate them into robust cloud solutions.
Evaluate and recommend new GCP services and technologies to optimize our cloud environment.
Understand and implement the fundamentals of GCP, including resource hierarchy, projects, organizations, and billing.
GCP Infrastructure Management:
Manage and maintain our existing GCP infrastructure, ensuring high availability, performance, and security.
Implement and manage infrastructure-as-code (IaC) using tools like Terraform or Cloud Deployment Manager.
Monitor and troubleshoot infrastructure issues, proactively identifying and resolving potential problems.
Implement and manage backup and disaster recovery strategies for our GCP environment.
Optimize cloud costs and resource utilization, including BigQuery slot management.
Collaboration & Communication:
Work closely with cross-functional teams, including data engineering, data science, application development, security, and compliance.
Communicate technical concepts and solutions effectively to both technical and non-technical stakeholders.
Provide guidance and mentorship to junior team members.
Participate in on-call rotation as needed.
Develop and maintain thorough and reliable documentation of all cloud infrastructure processes, configurations, and security protocols.
Qualifications:
Bachelor's degree in Computer Science, Engineering, or a related field.
Minimum of 5-8 years of experience in designing, implementing, and managing cloud infrastructure, with a strong focus on Google Cloud Platform (GCP).
Proven experience in architecting and implementing data lakes on GCP, specifically using BigQuery.
Hands-on experience with ETL/ELT processes and tools, with strong proficiency in Google Cloud Composer (Apache Airflow).
Solid understanding of GCP services such as Compute Engine, Cloud Storage, networking (VPC, firewall rules, Cloud DNS), IAM, Cloud Monitoring, and Cloud Logging.
Experience with infrastructure-as-code (IaC) tools like Terraform or Cloud Deployment Manager.
Strong understanding of security best practices for cloud environments, including identity and access management, data encryption, and network security.
Excellent problem-solving, analytical, and troubleshooting skills.
Strong communication, collaboration, and interpersonal skills.
Bonus Points:
Experience with Apigee for API management.
Experience with containerization technologies like Docker and serverless container platforms like Cloud Run.
Experience with Vertex AI for machine learning workflows on GCP.
Familiarity with GCP healthcare products and solutions (e.g., Cloud Healthcare API).
Knowledge of healthcare data standards and regulations (e.g., HIPAA, HL7, FHIR).
GCP Professional Cloud Architect certification.
Experience with scripting languages (e.g., Python, Bash).
Experience with Looker.
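For illustration of the BigQuery data-lake loading this role revolves around, a minimal hedged sketch using the google-cloud-bigquery client; the project, dataset, table, and Cloud Storage URI are hypothetical placeholders:

```python
# Minimal sketch of loading Parquet files from Cloud Storage into a
# BigQuery table. All project/dataset/bucket names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="example-healthcare-project")

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.PARQUET,
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

load_job = client.load_table_from_uri(
    "gs://example-landing-zone/claims/*.parquet",
    "example-healthcare-project.lake.claims",
    job_config=job_config,
)
load_job.result()  # block until the load completes

table = client.get_table("example-healthcare-project.lake.claims")
print(f"Table now has {table.num_rows} rows")
```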
Posted 1 month ago
5.0 - 8.0 years
9 - 13 Lacs
Hyderabad
Work from Office
Join Amgen's Mission of Serving Patients
At Amgen, if you feel like you're part of something bigger, it's because you are. Our shared mission, to serve patients living with serious illnesses, drives all that we do. Since 1980, we've helped pioneer the world of biotech in our fight against the world's toughest diseases. With our focus on four therapeutic areas, Oncology, Inflammation, General Medicine, and Rare Disease, we reach millions of patients each year. As a member of the Amgen team, you'll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science-based. If you have a passion for challenges and the opportunities that lie within them, you'll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career.
What you will do
Let's do this. Let's change the world. In this vital role, you will, as an expert IS Architect, lead the design and implementation of integration frameworks for pharmacovigilance (PV) systems spanning both SaaS and internally hosted environments. This role focuses on building secure, compliant, and scalable architectures to ensure seamless data flow between safety databases, external systems, and analytics platforms, without direct access to backend databases. The ideal candidate will work closely with PV system collaborators, SaaS vendors, and internal IT teams to deliver robust and efficient solutions.
Roles & Responsibilities:
Design hybrid integration architectures to manage data flows between SaaS-based PV systems and internally hosted systems and platforms.
Implement middleware solutions to bridge on-premise and cloud environments, applying an API-first integration design pattern and establishing secure data exchange mechanisms to ensure data consistency and compliance.
Work with SaaS providers and internal IT teams to define the integration approach for Extract-Transform-Load (ETL), event-driven architecture, and batch processing.
Design and maintain end-to-end data flow diagrams and blueprints that consider the unique challenges of hybrid environments.
Define and enforce data governance frameworks to maintain data quality, integrity, and traceability across integrated systems.
Lead all aspects of data lifecycle management for both cloud and internally hosted systems to ensure consistency and compliance.
Act as the main point of contact between pharmacovigilance teams, SaaS vendors, internal IT staff, and other parties to align technical solutions with business goals.
Ensure alignment with the delivery and platform teams to safeguard that the applications follow approved Amgen architectural and development guidelines as well as data/software standards.
Collaborate with analytics teams to ensure timely access to PV data for signal detection, trending, and regulatory reporting.
Continuously evaluate and improve integration frameworks to adapt to evolving PV requirements, data volumes, and business needs.
Provide technical guidance and mentorship to junior developers.
Basic Qualifications
Master's degree with 4 to 6 years of experience in Computer Science, software development, or a related field; OR
Bachelor's degree with 6 to 8 years of experience in Computer Science, software development, or a related field; OR
Diploma with 10 to 12 years of experience in Computer Science, software development, or a related field.
Must-Have Skills:
Demonstrable experience in architecting data pipelines and/or integrations across technology landscapes (SaaS, data lakes, internally hosted systems).
Experience with API integrations (such as MuleSoft) and ETL tools such as the Informatica platform, Snowflake, or Databricks.
Strong problem-solving skills, particularly in hybrid system integrations.
Superb communication and collaborator leadership skills; ability to explain technical concepts to non-technical clients.
Ability to balance technical solutions with business priorities and compliance needs.
Passion for using technology to improve pharmacovigilance and patient safety.
Experience with data transfer processes and resolving stuck or delayed data files.
Knowledge of testing methodologies and quality assurance standard processes.
Proficiency in working with data analysis and QA tools.
Understanding of data flows related to regulations such as GDPR and HIPAA.
Experience with SQL/NoSQL databases, database programming languages, and data modelling concepts.
Good-to-Have Skills:
Knowledgeable in SDLC, including requirements, design, testing, data analysis, and change control.
Knowledgeable in reporting tools (e.g., Tableau, Power BI).
Professional Certifications: SAFe for Architects certification (preferred)
Soft Skills:
Excellent analytical skills to gather options to deal with ambiguous scenarios.
Excellent leadership and progressive thinking abilities.
Strong verbal and written communication skills.
Ability to work effectively with global, virtual teams.
High degree of initiative and self-motivation.
Ability to balance multiple priorities.
Team-oriented, with a focus on achieving team goals.
Strong presentation and public speaking skills.
Ability to influence and drive to an intended outcome.
Ability to hold team members accountable to commitments.
Shift Information: This position requires you to work a later shift and may be assigned a second or third shift schedule. Candidates must be willing and able to work during evening or night shifts, as required based on business requirements.
EQUAL OPPORTUNITY STATEMENT
Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request an accommodation.
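To make the API-first integration pattern above concrete, a minimal hedged sketch of pulling records from a SaaS REST endpoint with retry/backoff; the endpoint URL, pagination fields, and token are hypothetical placeholders, not any vendor's actual API:

```python
# Minimal sketch of API-first ingestion from a SaaS system: page through
# a REST endpoint with retries and exponential backoff. The URL, auth
# token, and response fields are hypothetical.
import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

session = requests.Session()
session.mount("https://", HTTPAdapter(max_retries=Retry(
    total=5, backoff_factor=1.0, status_forcelist=[429, 500, 502, 503])))

def fetch_cases(since: str):
    """Yield case records modified since the given ISO date."""
    url = "https://saas-pv.example.com/api/v1/cases"  # hypothetical endpoint
    page = 1
    while True:
        resp = session.get(
            url,
            params={"modified_since": since, "page": page},
            headers={"Authorization": "Bearer <token>"},  # placeholder
            timeout=30,
        )
        resp.raise_for_status()
        payload = resp.json()
        yield from payload["results"]
        if not payload.get("next_page"):
            break
        page += 1
```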
Posted 1 month ago
2.0 - 5.0 years
7 - 11 Lacs
Hyderabad
Work from Office
Data Platform Engineer
About Amgen
Amgen harnesses the best of biology and technology to fight the world's toughest diseases and make people's lives easier, fuller, and longer. We discover, develop, manufacture, and deliver innovative medicines to help millions of patients. Amgen helped establish the biotechnology industry more than 40 years ago and remains on the cutting edge of innovation, using technology and human genetic data to push beyond what's known today.
What you will do
Roles & Responsibilities:
Work as a member of a Data Platform Engineering team that uses Cloud and Big Data technologies to design, develop, implement, and maintain solutions supporting various functional areas like Manufacturing, Commercial, Research, and Development.
Work closely with the Enterprise Data Lake delivery and platform teams to ensure that the applications are aligned with the overall architectural and development guidelines.
Research and evaluate technical solutions, including Databricks and AWS services, NoSQL databases, data science packages, platforms, and tools, with a focus on enterprise deployment capabilities like security, scalability, reliability, maintainability, and cost management.
Assist in building and managing relationships with internal and external business stakeholders.
Develop a basic understanding of core business problems and identify opportunities to use advanced analytics.
Assist in reviewing 3rd-party providers for new feature/function/technical fit with EEA's data management needs.
Work closely with the Enterprise Data Lake ecosystem leads to identify and evaluate emerging providers of data management and processing components that could be incorporated into the data platform.
Work with platform stakeholders to ensure effective cost observability and control mechanisms are in place for all aspects of data platform management.
Experience developing in an Agile development environment and comfort with Agile terminology and ceremonies.
Keenness to adopt new responsibilities, face challenges, and master new technologies.
What we expect of you
Basic Qualifications and Experience:
Master's degree in a computer science or engineering field and 1 to 3 years of relevant experience; OR
Bachelor's degree in a computer science or engineering field and 3 to 5 years of relevant experience; OR
Diploma and a minimum of 8+ years of relevant work experience.
Must-Have Skills:
Experience with Databricks (or Snowflake), including cluster setup, execution, and tuning.
Experience with common data processing libraries: Pandas, PySpark, SQLAlchemy.
Experience with UI frameworks (Angular.js or React.js).
Experience with data lake, data fabric, and data mesh concepts.
Experience with data modeling and performance tuning on relational databases.
Experience building ETL or ELT pipelines; hands-on experience with SQL/NoSQL.
Programming skills in one or more computer languages: SQL, Python, Java.
Experience with software engineering best practices, including but not limited to version control (Git, GitLab), CI/CD (GitLab, Jenkins, etc.), automated unit testing, and DevOps.
Exposure to Jira or Jira Align.
Good-to-Have Skills:
Knowledge of the R language will be considered an advantage.
Experience with cloud technologies, AWS preferred.
Cloud certifications: AWS, Databricks, Microsoft.
Familiarity with the use of AI for development productivity, such as GitHub Copilot, Databricks Assistant, Amazon Q Developer, or equivalent.
Knowledge of Agile and DevOps practices.
Skills in disaster recovery planning.
Familiarity with load testing tools (JMeter, Gatling).
Basic understanding of AI/ML for monitoring.
Knowledge of distributed systems and microservices.
Data visualization skills (Tableau, Power BI).
Strong communication and leadership skills.
Understanding of compliance and auditing requirements.
Soft Skills:
Excellent analytical and problem-solving skills.
Excellent written and verbal communication skills (English), translating technology content into business language at various levels.
Ability to work effectively with global, virtual teams.
High degree of initiative and self-motivation.
Ability to manage multiple priorities successfully.
Team-oriented, with a focus on achieving team goals.
Strong time and task leadership skills to estimate and successfully meet project timelines, with the ability to bring consistency and quality assurance across various projects.
What you can expect of us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.
Apply now for a career that defies imagination
Objects in your future are closer than they appear. Join us. careers.amgen.com
As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed, and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease.
Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
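Pairing two of the must-have libraries above, Pandas and SQLAlchemy, a minimal hedged sketch of a small ELT step; the connection string, table, and column names are hypothetical placeholders:

```python
# Minimal sketch of an ELT step using Pandas with SQLAlchemy: read from a
# relational source, clean, and publish a curated table. All names below
# are hypothetical.
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("postgresql+psycopg2://app:secret@localhost/research")

df = pd.read_sql("SELECT sample_id, assay, value FROM raw_assays", engine)

# Tidy up: drop incomplete rows and normalise the assay label.
df = df.dropna(subset=["value"])
df["assay"] = df["assay"].str.strip().str.lower()

# Publish the curated table for downstream consumers.
df.to_sql("curated_assays", engine, if_exists="replace", index=False)
```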
Posted 1 month ago
15.0 - 20.0 years
4 - 8 Lacs
Chennai
Work from Office
Project Role: Software Development Engineer
Project Role Description: Analyze, design, code, and test multiple components of application code across one or more clients. Perform maintenance, enhancements, and/or development work.
Must-have skills: BlueYonder Order Management
Good-to-have skills: NA
Minimum 5 years of experience is required.
Educational Qualification: 15 years of full-time education
Summary:
Role Overview: We are looking for an experienced Integration Architect to lead the design and execution of integration strategies for Blue Yonder (BY) implementations across cloud-native environments. The ideal candidate will possess strong expertise in integrating supply chain platforms with enterprise cloud systems, data lakes, and Snowflake, along with working knowledge of Generative AI (Gen AI) to enhance automation and intelligence in integration and data workflows.
Key Responsibilities:
Functional Expertise
- Must-have skill: Blue Yonder (BY) Order Promising modules (formerly JDA)
- Knowledge of ATP (Available to Promise), CTP (Capable to Promise), and Order Fulfillment logic
- Experience with S&OP, Demand Planning, and Inventory Availability functions
- Ability to design and interpret supply-demand match rules, sourcing policies, and allocation strategies
Technical Acumen
- Strong grasp of BY architecture, workflows, and configuration capabilities
- Proficiency in tools like BY Platform Manager, BY Studio, and BY Workbench
- Understanding of data modeling, integration frameworks (REST, SOAP APIs, flat file interfaces), and middleware platforms
- Familiarity with PL/SQL, Java, and batch job orchestration for customizations and enhancements
Integration & Ecosystem Knowledge
- Integration experience with OMS, ERP (e.g., SAP, Oracle), WMS, and TMS
- Experience in real-time inventory visibility, order brokering, and global ATP engines
- Exposure to microservices architecture and cloud deployments (BY Luminate Platform)
Implementation & Support Experience
- Proven experience in end-to-end BY Order Promising implementations
- Ability to conduct solution design workshops, fit-gap analysis, and UAT management
- Experience in post-go-live support, performance tuning, and issue triage/resolution
Soft Skills & Project Leadership
- Ability to act as a bridge between business and technical teams
- Strong stakeholder communication, requirement gathering, and documentation skills
- Excellent problem-solving and troubleshooting capabilities
- Familiarity with Agile and Waterfall project methodologies
Preferred Certifications
- Blue Yonder Functional/Technical Certification in Order Promising or Fulfillment
- Supply chain certifications like APICS/CSCP (desirable)
Additional Information:
- The candidate should have a minimum of 5 years of experience in BlueYonder Demand Planning.
- This position is based at our Chennai office.
- 15 years of full-time education is required.
Qualification: 15 years of full-time education
Posted 1 month ago
15.0 - 20.0 years
5 - 9 Lacs
Chennai
Work from Office
Project Role: Advanced Application Engineer
Project Role Description: Utilize modular architectures, next-generation integration techniques, and a cloud-first, mobile-first mindset to provide vision to Application Development Teams. Work with an Agile mindset to create value across projects of multiple scopes and scale.
Must-have skills: BlueYonder Enterprise Supply Planning
Good-to-have skills: NA
Minimum 12 years of experience is required.
Educational Qualification: 15 years of full-time education
Summary: We are looking for an experienced Integration Architect to lead the design and execution of integration strategies for Blue Yonder (BY) implementations across cloud-native environments. The ideal candidate will possess strong expertise in integrating supply chain platforms with enterprise cloud systems, data lakes, and Snowflake, along with working knowledge of Generative AI (Gen AI) to enhance automation and intelligence in integration and data workflows.
Roles & Responsibilities:
- Architect and implement end-to-end integration solutions for Blue Yonder (WMS, TMS, ESP, etc.) with enterprise systems (ERP, CRM, legacy).
- Design integration flows using cloud-native middleware platforms (Azure Integration Services, AWS Glue, GCP Dataflow, etc.).
- Enable real-time and batch data ingestion into cloud-based data lakes (e.g., AWS S3, Azure Data Lake, Google Cloud Storage) and downstream to Snowflake.
- Develop scalable data pipelines to support analytics, reporting, and operational insights from Blue Yonder and other systems.
- Integrate Snowflake as an enterprise data platform for unified reporting and machine learning use cases.
Professional & Technical Skills:
- Leverage Generative AI (e.g., OpenAI, Azure OpenAI) for: auto-generating integration mapping specs and documentation; enhancing data quality and reconciliation with intelligent agents; developing copilots for integration teams to speed up development and troubleshooting.
- Ensure integration architecture adheres to security, performance, and compliance standards.
- Collaborate with enterprise architects, functional consultants, data engineers, and business stakeholders.
- Lead troubleshooting, performance tuning, and hypercare support post-deployment.
Additional Information:
- The candidate should have a minimum of 5 years of experience in BlueYonder Enterprise Supply Planning.
- This position is based at our Chennai office.
- 15 years of full-time education is required.
Qualification: 15 years of full-time education
Posted 1 month ago
15.0 - 20.0 years
10 - 14 Lacs
Bengaluru
Work from Office
Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-have skills: AWS BigData
Good-to-have skills: NA
Minimum 12 years of experience is required.
Educational Qualification: 15 years of full-time education
Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your day will involve overseeing the application development process and ensuring project success.
Roles & Responsibilities:
- Expected to be an SME
- Collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Expected to provide solutions to problems that apply across multiple teams
- Lead the application development process effectively
- Ensure timely delivery of projects
- Provide guidance and mentorship to team members
Professional & Technical Skills:
- Must-have skills: Proficiency in AWS BigData
- Strong understanding of cloud computing and AWS services
- Experience in designing and implementing Big Data solutions
- Knowledge of data warehousing and data lake concepts
- Hands-on experience with big data technologies such as Hadoop and Spark
Additional Information:
- The candidate should have a minimum of 12 years of experience in AWS BigData
- This position is based at our Gurugram office
- 15 years of full-time education is required
Qualification: 15 years of full-time education
Posted 1 month ago
6.0 - 8.0 years
6 - 10 Lacs
Pune
Work from Office
Job Title: Production Specialist, Associate
Location: Pune, India
Role Description
Our organization within Deutsche Bank is AFC Production Services. We are responsible for providing technical L2 application support for business applications. The AFC (Anti-Financial Crime) line of business has a current portfolio of 25+ applications. The organization is in the process of transforming itself using Google Cloud and many new technology offerings. Your role will include hands-on production support and active involvement in technical issue resolution across multiple applications.
Deutsche Bank's Corporate Bank division is a leading provider of cash management, trade finance, and securities finance. We complete green-field projects that deliver the best Corporate Bank - Securities Services products in the world. Our team is diverse, international, and driven by a shared focus on clean code and valued delivery. At every level, agile minds are rewarded with competitive pay, support, and opportunities to excel.
You will work as part of a cross-functional agile delivery team. You will bring an innovative approach to software development, focusing on using the latest technologies and practices, as part of a relentless focus on business value. You will be someone who sees engineering as a team activity, with a predisposition to open code, open discussion, and creating a supportive, collaborative environment. You will be ready to contribute to all stages of software delivery, from initial analysis right through to production support.
What we'll offer you
As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
Best-in-class leave policy
Gender-neutral parental leaves
100% reimbursement under childcare assistance benefit (gender neutral)
Sponsorship for industry-relevant certifications and education
Employee Assistance Program for you and your family members
Comprehensive hospitalization insurance for you and your dependents
Accident and term life insurance
Complementary health screening for those 35 yrs. and above
Your key responsibilities
Provide technical support by handling and consulting on BAU, incidents/emails/alerts for the respective applications.
Perform post-mortem and root cause analysis using ITIL standards of Incident Management, Service Request fulfillment, Change Management, Knowledge Management, and Problem Management.
Analyze errors occurring in batch processing and the interfaces of related systems.
Determine and implement resolutions or workarounds.
Support the resolution of high-impact incidents on our applications, including attendance at incident bridge calls.
Escalate incident tickets in a timely manner and communicate effectively with business users, development teams, and stakeholders.
Provide resolution for open problems or ensure that the appropriate parties have been tasked with doing so.
Support the handover from new projects/applications into Production Services with Service Transition before the go-live phase.
Assist in the process to approve application code releases as well as tasks assigned to support.
Keep key stakeholders informed using communication templates.
Automate routine tasks and enhance operational efficiencies through scripts and tools.
Support the transition of applications to Google Cloud and new technology offerings.
Proactively identify performance bottlenecks and suggest optimization strategies.
Support audit, compliance, and regulatory requirements related to AFC applications.
The candidate will have to work in shifts as part of a rota covering APAC and EMEA hours between 07:00 IST and 09:00 PM IST (2 shifts). In the event of major outages or issues, we may ask for flexibility to help provide appropriate cover.
Supporting on-call support activities
Your skills and experience
4-8 years of experience in providing hands-on IT application support.
Bachelor's degree from an accredited college or university with a concentration in Computer Science or an IT-related discipline (or equivalent work experience/diploma/certification).
Preferred: ITIL v3 Foundation certification or higher.
Clear and concise documentation in general, and especially proper documentation of the status of incidents, problems, and service requests in the Service Management tool.
Monitoring tools: knowledge of Elastic Search, Control-M, Grafana, Geneos, OpenShift, Prometheus, Google Cloud Monitoring, Airflow, Splunk.
Red Hat Enterprise Linux (RHEL): professional skill in searching logs, process commands, starting/stopping processes, and use of OS commands to aid in tasks needed to resolve or investigate issues. Shell scripting knowledge a plus.
Understanding of database concepts and exposure to working with Oracle, MS SQL, BigQuery, etc. databases.
Ability to work across countries, regions, and time zones with a broad range of cultures and technical capability.
Skills That Will Help You Excel
Strong written and oral communication skills, including the ability to communicate technical information to a non-technical audience, and good analytical and problem-solving skills.
Analytical and problem-solving skills, with a structured approach to troubleshooting, issue resolution, and its documentation.
Able to train, coach, and mentor, and know where each technique is best applied.
Experience with GCP or another public cloud provider to build applications.
Experience in an investment bank, financial institution, or large corporation using enterprise hardware and software.
Knowledge of Actimize, Mantas, and case management software is good to have.
Working knowledge of Big Data Hadoop/Secure Data Lake is a plus.
Prior experience in automation projects is great to have.
Exposure to Python, shell, Ansible, or other scripting languages for automation and process improvement.
Strong stakeholder management skills, ensuring seamless coordination between business, development, and infrastructure teams.
How we'll support you
Training and development to help you excel in your career.
Coaching and support from experts in your team.
A culture of continuous learning to aid progression.
A range of flexible benefits that you can tailor to suit your needs.
Posted 1 month ago
4.0 - 8.0 years
6 - 10 Lacs
Bengaluru
Work from Office
At BCE Global Tech, immerse yourself in exciting projects that are shaping the future of both consumer and enterprise telecommunications. This involves building innovative mobile apps to enhance user experiences and enable seamless connectivity on the go. Thrive in diverse roles like Full Stack Developer, Backend Developer, UI/UX Designer, DevOps Engineer, Cloud Engineer, Data Science Engineer, and Scrum Master, at a workplace that encourages you to freely share your bold and different ideas. If you are passionate about technology and eager to make a difference, we want to hear from you! Apply now to join our dynamic team in Bengaluru.
We are seeking a talented Site Reliability Engineer (SRE) to join our team. The ideal candidate will have a strong background in software engineering and systems administration, with a passion for building scalable and reliable systems. As an SRE, you will collaborate with development and operations teams to ensure our services are reliable, performant, and highly available.
Key Responsibilities
Ensure the 24/7 operations and reliability of data services in our production GCP and on-premise Hadoop environments.
Collaborate with the data engineering development team to design, build, and maintain scalable, reliable, and secure data pipelines and systems.
Develop and implement monitoring, alerting, and incident response strategies to proactively identify and resolve issues before they impact production.
Drive the implementation of security and reliability best practices across the software development life cycle.
Contribute to the development of tools and automation to streamline the management and operation of data services.
Participate in the on-call rotation and respond to incidents in a timely and effective manner.
Continuously evaluate and improve the reliability, scalability, and performance of data services.
Technology Skills
4+ years of experience in site reliability engineering or a similar role.
Strong experience with Google Cloud Platform (GCP) services, including BigQuery, Dataflow, Pub/Sub, and Cloud Storage.
Experience with on-premise Hadoop environments and related technologies (HDFS, Hive, Spark, etc.).
Proficiency in at least one programming language (Python, Scala, Java, Go, etc.).
Required qualifications to be successful in this role
Bachelor's degree in computer science, engineering, or a related field.
8-10 years of experience as an SRE.
Proven experience as an SRE, DevOps engineer, or similar role.
Strong problem-solving skills and ability to work under pressure.
Excellent communication and collaboration skills.
Flexible to work in EST time zones (9-5 EST).
Additional Information
Job Type: Full Time
Work Profile: Hybrid (Work from Office/Remote)
Years of Experience: 8-10 years
Location: Bangalore
What We Offer
Competitive salaries and comprehensive health benefits
Flexible work hours and remote work options
Professional development and training opportunities
A supportive and inclusive work environment
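Since Pub/Sub is among the named GCP services, a minimal hedged sketch of a pull subscriber of the kind such data services rely on; the project and subscription names are hypothetical placeholders:

```python
# Minimal sketch of a Pub/Sub pull subscriber. Project and subscription
# names below are hypothetical.
from concurrent.futures import TimeoutError
from google.cloud import pubsub_v1

subscriber = pubsub_v1.SubscriberClient()
sub_path = subscriber.subscription_path("example-project", "telemetry-sub")

def callback(message: pubsub_v1.subscriber.message.Message) -> None:
    print(f"Received {message.message_id}: {message.data[:80]!r}")
    message.ack()  # acknowledge so the message is not redelivered

streaming_pull = subscriber.subscribe(sub_path, callback=callback)
try:
    streaming_pull.result(timeout=60)  # listen for one minute, then stop
except TimeoutError:
    streaming_pull.cancel()
    streaming_pull.result()  # block until shutdown completes
```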
Posted 1 month ago
5.0 - 10.0 years
7 - 12 Lacs
Bengaluru
Work from Office
Are you ready to play a key role in transforming Thomson Reuters into a truly data-driven company? Join our Data & Analytics (D&A) function and be part of the strategic ambition to build, embed, and mature a data-driven culture across the entire organization. The Data Architecture organization within the Data and Analytics division is responsible for designing and implementing a unified data strategy that enables the efficient, secure, and governed use of data across the organization. We aim to create a trusted and customer-centric data ecosystem, built on a foundation of data quality, security, and openness, and guided by the Thomson Reuters Trust Principles. Our team is dedicated to developing innovative data solutions that drive business value while upholding the highest standards of data management and ethics.
About the Role
In this opportunity as a Data Architect, you will:
Lead Architecture Design: Architect and lead data platform evolution. Spearhead the conceptual, logical, and physical architecture design for our enterprise Data Platform (encompassing areas like our data lake, data warehouse, streaming services, and master data management systems). You will define and enforce data modeling standards, data flow patterns, and integration strategies to serve a diverse audience, from data engineers to AI/ML practitioners and BI analysts.
Technical Standards and Best Practices: Research and recommend technical standards, ensuring the architecture aligns with overall technology and product strategy. Be hands-on in implementing core components reusable across applications.
Hands-on Prototyping and Framework Development: While this is a strategic role, maintain a hands-on approach by designing and implementing proofs of concept and core reusable components/frameworks for the data platform. This includes developing best practices and templates for data pipelines, particularly leveraging dbt for transformations, and ensuring efficient data processing and quality.
Champion Data Ingestion Strategies: Design and oversee the implementation of robust, scalable, and automated cloud data ingestion pipelines from a variety of sources (e.g., APIs, databases, streaming feeds) into our AWS-based data platform, utilizing services such as AWS Glue, Kinesis, Lambda, S3, and potentially third-party ETL/ELT tools (a minimal S3 landing sketch follows this listing).
Design and optimize solutions utilizing our core cloud data stack, including deep expertise in Snowflake (e.g., architecture, performance tuning, security, data sharing, Snowpipe, Streams, Tasks) and a broad range of AWS data services (e.g., S3, Glue, EMR, Kinesis, Lambda, Redshift, DynamoDB, Athena, Step Functions, MWAA/Managed Airflow) to build and automate end-to-end analytics and data science workflows.
Data-Driven Decision-Making: Make quick and effective data-driven decisions, demonstrating strong problem-solving and analytical skills. Align strategies with company goals.
Stakeholder Collaboration: Collaborate closely with external and internal stakeholders, including business teams and product managers. Define roadmaps, understand functional requirements, and lead the team through the end-to-end development process.
Team Collaboration: Work in a collaborative, team-oriented environment, sharing information and diverse ideas and partnering with cross-functional and remote teams.
Quality and Continuous Improvement: Focus on quality, continuous improvement, and technical standards. Keep service focus on reliability, performance, and scalability while adhering to industry best practices.
Technology Advancement: Continuously update yourself with next-generation technology and development tools. Contribute to process development practices.
About You
You're a fit for the role of Data Architect, Data Platform if your background includes:
Educational Background: Bachelor's degree in information technology.
Experience: 10+ years of IT experience, with at least 5 years in a lead design or architectural capacity.
Technical Expertise: Broad knowledge of and experience with cloud-native software design, microservices architecture, and data warehousing, plus proficiency in Snowflake.
Cloud and Data Skills: Experience building and automating end-to-end analytics pipelines on AWS; familiarity with NoSQL databases.
Data Pipeline and Ingestion Mastery: Extensive experience designing, building, and automating robust and scalable cloud data ingestion frameworks and end-to-end data pipelines on AWS, including experience with various ingestion patterns (batch, streaming, CDC) and tools.
Data Modeling: Proficient with data modeling concepts and the data development lifecycle.
Advanced Data Modeling: Demonstrable expertise in designing and implementing various data models (e.g., relational, dimensional, Data Vault, NoSQL schemas) for transactional, analytical, and operational workloads, with a strong understanding of the data development lifecycle from requirements gathering to deployment and maintenance.
Leadership: Proven ability to lead architectural discussions, influence technical direction, and mentor data engineers, effectively balancing complex technical decisions with user needs and overarching business constraints.
Programming Skills: Strong programming skills in languages such as Python or Java for data manipulation, automation, and API development.
Data Governance and Security Acumen: Deep understanding and practical experience designing and implementing solutions compliant with robust data governance principles, data security best practices (e.g., encryption, access controls, masking), and relevant privacy regulations (e.g., GDPR, CCPA).
Containerization and Orchestration: Experience with containerization technologies like Docker and orchestration tools like Kubernetes.
#LI-VN1
What's in it For You
Hybrid Work Model: We've adopted a flexible hybrid working environment (2-3 days a week in the office, depending on the role) for our office-based roles while delivering a seamless experience that is digitally and physically connected.
Flexibility & Work-Life Balance: Flex My Way is a set of supportive workplace policies designed to help manage personal and professional responsibilities, whether caring for family, giving back to the community, or finding time to refresh and reset. This builds upon our flexible work arrangements, including work from anywhere for up to 8 weeks per year, empowering employees to achieve a better work-life balance.
Career Development and Growth: By fostering a culture of continuous learning and skill development, we prepare our talent to tackle tomorrow's challenges and deliver real-world solutions. Our Grow My Way programming and skills-first approach ensures you have the tools and knowledge to grow, lead, and thrive in an AI-enabled future.
Industry Competitive Benefits: We offer comprehensive benefit plans including flexible vacation, two company-wide Mental Health Days off, access to the Headspace app, retirement savings, tuition reimbursement, employee incentive programs, and resources for mental, physical, and financial wellbeing.
Culture: Globally recognized, award-winning reputation for inclusion and belonging, flexibility, work-life balance, and more. We live by our values: Obsess over our Customers, Compete to Win, Challenge (Y)our Thinking, Act Fast / Learn Fast, and Stronger Together.
Social Impact: Make an impact in your community with our Social Impact Institute. We offer employees two paid volunteer days off annually and opportunities to get involved with pro-bono consulting projects and Environmental, Social, and Governance (ESG) initiatives.
Making a Real-World Impact: We are one of the few companies globally that helps its customers pursue justice, truth, and transparency. Together, with the professionals and institutions we serve, we help uphold the rule of law, turn the wheels of commerce, catch bad actors, report the facts, and provide trusted, unbiased information to people all over the world.
About Us
Thomson Reuters informs the way forward by bringing together the trusted content and technology that people and organizations need to make the right decisions. We serve professionals across legal, tax, accounting, compliance, government, and media. Our products combine highly specialized software and insights to empower professionals with the data, intelligence, and solutions needed to make informed decisions, and to help institutions in their pursuit of justice, truth, and transparency. Reuters, part of Thomson Reuters, is a world-leading provider of trusted journalism and news.
We are powered by the talents of 26,000 employees across more than 70 countries, where everyone has a chance to contribute and grow professionally in flexible work environments. At a time when objectivity, accuracy, fairness, and transparency are under attack, we consider it our duty to pursue them. Sound exciting? Join us and help shape the industries that move society forward.
As a global business, we rely on the unique backgrounds, perspectives, and experiences of all employees to deliver on our business goals. To ensure we can do that, we seek talented, qualified employees in all our operations around the world regardless of race, color, sex/gender, including pregnancy, gender identity and expression, national origin, religion, sexual orientation, disability, age, marital status, citizen status, veteran status, or any other protected classification under applicable law. Thomson Reuters is proud to be an Equal Employment Opportunity Employer providing a drug-free workplace. We also make reasonable accommodations for qualified individuals with disabilities and for sincerely held religious beliefs in accordance with applicable law. More information on requesting an accommodation here. Learn more on how to protect yourself from fraudulent job postings here. More information about Thomson Reuters can be found on thomsonreuters.com.
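As referenced in the ingestion responsibilities of this listing, here is a minimal sketch of the first hop of a batch ingestion pipeline: landing a source extract in S3 under a date-partitioned prefix for downstream Glue or Snowpipe loads. The bucket name and prefix layout are hypothetical; the script requires boto3 and AWS credentials.

```python
"""Minimal S3 landing sketch, assuming a hypothetical bucket and prefix layout."""
import json
from datetime import datetime, timezone

import boto3

def land_batch(records: list[dict], bucket: str = "example-data-lake") -> str:
    """Write the batch as newline-delimited JSON to a partitioned key and return the key."""
    now = datetime.now(timezone.utc)
    key = f"raw/events/dt={now:%Y-%m-%d}/batch-{now:%H%M%S}.jsonl"
    body = "\n".join(json.dumps(r) for r in records)
    boto3.client("s3").put_object(Bucket=bucket, Key=key, Body=body.encode("utf-8"))
    return key

if __name__ == "__main__":
    print(land_batch([{"event": "demo", "value": 1}]))
```

The `dt=YYYY-MM-DD` prefix follows the Hive-style partitioning convention that Glue crawlers and Athena recognize automatically, which keeps downstream queries prunable by date.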
Posted 1 month ago
2.0 - 6.0 years
10 - 15 Lacs
Ahmedabad
Work from Office
We are seeking a highly skilled and innovative AI & ML Technology Specialist to drive AI initiatives within our HR function. The ideal candidate will explore and implement advanced AI and machine learning solutions to enhance HR processes, leveraging cutting-edge technologies such as data lakes, AI/ML bots, predictive analytics, and automation frameworks.
Key Responsibilities:
Develop AI-driven HR Solutions: Identify and implement AI and ML applications to optimize recruitment, onboarding, employee engagement, performance management, and workforce planning.
Data Management & Analytics: Design and manage high-end HR data lakes, ensuring data integrity, security, and accessibility for advanced analytics.
AI/ML Bot Development: Work on intelligent HR chatbots for employee queries, HR service automation, and improving user experience.
Predictive Workforce Analytics: Utilize machine learning models to analyze workforce trends, predict attrition, assess employee satisfaction, and optimize talent management strategies (a minimal attrition-model sketch follows this listing).
Collaborate with HR & IT Teams: Partner with cross-functional teams to understand business needs, develop AI-driven HR solutions, and ensure seamless integration with existing HR systems.
Research & Continuous Innovation: Stay up to date with emerging AI/ML trends, tools, and frameworks, recommending best practices for HR transformation.
Qualifications & Skills:
Bachelor's/Master's degree in Computer Science, Data Science, Artificial Intelligence, or a related field.
Proven experience in AI & ML technologies, with a focus on HR applications.
Strong knowledge of data lakes, predictive analytics, NLP, chatbot development, and automation.
Proficiency in programming languages and frameworks such as Python, R, TensorFlow, or PyTorch.
Experience with HR tech platforms, cloud-based AI solutions, and big data analytics is a plus.
Excellent problem-solving skills, an analytical mindset, and the ability to communicate technical concepts to non-technical stakeholders.
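As a minimal sketch of the predictive-attrition work described above: a logistic-regression baseline on a toy, hypothetical HR dataset. Every row, column, and threshold here is illustrative only; a real model would pull far richer features from the HR data lake.

```python
"""Minimal attrition-prediction sketch on hypothetical toy data (scikit-learn)."""
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Toy feature set; all values are illustrative, not real HR data
df = pd.DataFrame({
    "tenure_months":     [6, 48, 12, 60, 3, 24, 36, 9],
    "satisfaction":      [0.3, 0.9, 0.4, 0.8, 0.2, 0.7, 0.6, 0.35],
    "last_raise_months": [18, 6, 14, 8, 20, 10, 12, 16],
    "left_company":      [1, 0, 1, 0, 1, 0, 0, 1],  # 1 = employee left
})

X, y = df.drop(columns="left_company"), df["left_company"]
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42, stratify=y
)

model = LogisticRegression().fit(X_train, y_train)
print("held-out AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
```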
Posted 1 month ago
7.0 - 12.0 years
0 - 2 Lacs
Pune, Ahmedabad, Gurugram
Work from Office
Urgent Hiring: Azure Data Engineer (Strong PySpark + SCD II/III Expert)
Work Mode: Remote
Client-focused interview on PySpark + SCD II/III
Key Must-Haves:
Very strong hands-on PySpark coding
Practical experience implementing Slowly Changing Dimensions (SCD) Type II and Type III (a minimal Type II sketch follows this listing)
Strong expertise in Azure data engineering (ADF, Databricks, Data Lake, Synapse)
Proficiency in SQL and Python for scripting and transformation
Strong understanding of data warehousing concepts and ETL pipelines
Good to Have:
Experience with Microsoft Fabric
Familiarity with Power BI
Domain knowledge in Finance, Procurement, and Human Capital
Note: This role is highly technical. The client will focus interviews on PySpark coding and SCD Type II/III implementation. Only share profiles that are hands-on and experienced in these areas.
Share strong, relevant profiles to: b.simrana@ekloudservices.com
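For reference, here is a minimal SCD Type II sketch in PySpark on a hypothetical employee dimension. Change detection tracks a single attribute (city) for brevity; production pipelines typically hash all tracked columns, handle brand-new keys, and run the equivalent logic as a Delta Lake MERGE.

```python
"""Minimal SCD Type II sketch in PySpark; all tables and columns are hypothetical."""
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("scd2-sketch").getOrCreate()

dim = spark.createDataFrame(
    [(1, "Alice", "Pune", "2023-01-01", None, True),
     (2, "Bob", "Delhi", "2022-01-01", "2023-01-01", False),
     (2, "Bob", "Chennai", "2023-01-01", None, True)],
    ["emp_id", "name", "city", "eff_from", "eff_to", "is_current"],
)
updates = spark.createDataFrame([(1, "Alice", "Mumbai")], ["emp_id", "name", "city"])

today = F.date_format(F.current_date(), "yyyy-MM-dd")
current = dim.filter("is_current")
history = dim.filter(~F.col("is_current"))

# Keys whose tracked attribute changed in this batch
changed_keys = (
    current.alias("c")
    .join(updates.alias("u"), "emp_id")
    .filter(F.col("c.city") != F.col("u.city"))
    .select("emp_id")
)

# 1) Close out the superseded current rows
expired = (
    current.join(changed_keys, "emp_id", "left_semi")
    .withColumn("eff_to", today)
    .withColumn("is_current", F.lit(False))
)

# 2) Open new current rows for the changed keys (brand-new keys omitted for brevity)
opened = (
    updates.join(changed_keys, "emp_id", "left_semi")
    .withColumn("eff_from", today)
    .withColumn("eff_to", F.lit(None).cast("string"))
    .withColumn("is_current", F.lit(True))
)

# 3) Untouched current rows pass through unchanged
untouched = current.join(changed_keys, "emp_id", "left_anti")

result = history.unionByName(untouched).unionByName(expired).unionByName(opened)
result.orderBy("emp_id", "eff_from").show()
```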
Posted 1 month ago