
3139 Redshift Jobs - Page 19

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

4.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Do you enjoy hands-on technical work? Do you enjoy being part of a team that ensures the highest quality? Join our Digital Technology - Data & Analytics team. Baker Hughes' Digital Technology team provides and creates tech solutions that cater to the needs of our customers. As a global team, we collaborate to provide cutting-edge solutions to solve our customers' problems. We support them by providing materials management, planning, inventory, and warehouse solutions.

Take ownership of innovative Data Analytics projects. The Data Engineering team helps solve our customers' toughest challenges - making flights safer, power cheaper, and oil & gas production safer for people and the environment - by leveraging data and analytics. The Data Architect will work as a technical leader and architect in Data & Information, responsible for data handling, data warehouse building, and architecture.

As a Senior Data Engineer, you will be responsible for:
- Designing and implementing scalable and robust data pipelines to collect, process, and store data from various sources.
- Developing and maintaining data warehouse and ETL (Extract, Transform, Load) processes for data integration and transformation.
- Optimizing and tuning the performance of data systems to ensure efficient data processing and analysis.
- Collaborating with product managers and analysts to understand data requirements and implement solutions for data modeling and analysis.
- Identifying and resolving data quality issues, ensuring data accuracy, consistency, and completeness.
- Implementing and maintaining data governance and security measures to protect sensitive data.
- Monitoring and troubleshooting data infrastructure, performing root cause analysis, and implementing necessary fixes.
- Ensuring the use of state-of-the-art methodologies to carry out the job in the most productive and effective way. This may include research activities to identify and introduce new technologies in the field of data acquisition and data analysis.

Fuel your passion. To be successful in this role you will:
- Have a Bachelor's degree in Computer Science or a "STEM" major (Science, Technology, Engineering and Math) with a minimum of 4 years of experience.
- Have proven experience as a Data Engineer or in a similar role, working with large-scale data processing and storage systems.
- Have proficiency in SQL and database management systems (e.g., MySQL, PostgreSQL, or Oracle).
- Have experience building complex jobs for SCD-type mappings using ETL tools such as Talend, Informatica, etc.
- Have experience with data visualization and reporting tools (e.g., Tableau, Power BI).
- Have strong problem-solving and analytical skills, with the ability to handle complex data challenges.
- Have excellent communication and collaboration skills to work effectively in a team environment.
- Have experience in data modeling, data warehousing, and ETL principles.
- Have familiarity with cloud platforms like AWS, Azure, or GCP, and their data services (e.g., S3, Redshift, BigQuery).
- Have advanced knowledge of distributed computing and parallel processing.
- Have experience with real-time data processing and streaming technologies (e.g., Apache Kafka, Apache Flink). (Good to have)
- Have knowledge of containerization and orchestration technologies (e.g., Docker, Kubernetes).
- Have certification in relevant technologies or data engineering disciplines.

Work in a way that works for you. We recognize that everyone is different and that the way in which people want to work and deliver at their best is different for everyone too. In this role, we can offer the following flexible working pattern: working flexible hours - flexing the times when you work in the day to help you fit everything in and work when you are the most productive.

Working with us: our people are at the heart of what we do at Baker Hughes.
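The SCD-type mappings mentioned above are most commonly Type 2 (history-preserving) merges: when a tracked attribute changes, the current dimension row is closed and a new current row is opened. A minimal sketch of that logic in plain Python - purely illustrative; the table shape, key, and field names are invented, not from the posting:

```python
from datetime import date

def scd2_merge(dim_rows, incoming, key="customer_id", tracked=("city",), today=None):
    """Apply a Slowly Changing Dimension Type 2 merge.

    dim_rows: existing dimension rows (dicts with 'valid_from'/'valid_to'/'is_current')
    incoming: fresh source rows carrying the natural key and tracked attributes
    Returns the updated list of dimension rows.
    """
    today = today or date.today().isoformat()
    current = {r[key]: r for r in dim_rows if r["is_current"]}
    out = list(dim_rows)
    for row in incoming:
        cur = current.get(row[key])
        if cur is None:
            # brand-new key: open a fresh current record
            out.append({**row, "valid_from": today, "valid_to": None, "is_current": True})
        elif any(cur[c] != row[c] for c in tracked):
            # tracked attribute changed: close the old record, open a new one
            cur["valid_to"] = today
            cur["is_current"] = False
            out.append({**row, "valid_from": today, "valid_to": None, "is_current": True})
        # unchanged rows are left untouched
    return out
```

In Talend or Informatica the same pattern is configured declaratively in an SCD component rather than hand-coded.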
We know we are better when all of our people are developed, engaged and able to bring their whole authentic selves to work. We invest in the health and well-being of our workforce, train and reward talent and develop leaders at all levels to bring out the best in each other.

Working for you: our inventions have revolutionized energy for over a century. But to keep going forward tomorrow, we know we have to push the boundaries today. We prioritize rewarding those who embrace change with a package that reflects how much we value their input. Join us, and you can expect:
- Contemporary work-life balance policies and wellbeing activities
- Comprehensive private medical care options
- Safety net of life insurance and disability programs
- Tailored financial programs
- Additional elected or voluntary benefits

About Us: We are an energy technology company that provides solutions to energy and industrial customers worldwide. Built on a century of experience and conducting business in over 120 countries, our innovative technologies and services are taking energy forward - making it safer, cleaner and more efficient for people and the planet.

Join Us: Are you seeking an opportunity to make a real difference in a company that values innovation and progress? Join us and become part of a team of people who will challenge and inspire you! Let's come together and take energy forward. Baker Hughes Company is an Equal Opportunity Employer. Employment decisions are made without regard to race, color, religion, national or ethnic origin, sex, sexual orientation, gender identity or expression, age, disability, protected veteran status or other characteristics protected by law. R149189

Posted 1 week ago

Apply

0 years

0 Lacs

Krishnagiri, Tamil Nadu, India

On-site

- Experience with cloud databases and data warehouses (AWS Aurora, RDS/PG, Redshift, DynamoDB, Neptune).
- Building and maintaining scalable real-time database systems using the AWS stack (Aurora, RDS/PG, Lambda) to enhance business decision-making capabilities.
- Providing valuable insights and contributing to the design, development, and architecture of data solutions.
- Experience utilizing various design and coding techniques to improve query performance.
- Expertise in performance optimization, capacity management, and workload management.
- Working knowledge of relational database internals (locking, consistency, serialization, recovery paths).
- Awareness of customer workloads and use cases, including performance, availability, and scalability.
- Monitoring database health and promptly identifying and resolving issues.
- Maintaining comprehensive documentation for databases, business continuity plans, cost usage, and processes.
- Proficiency in using Terraform or Ansible for database provisioning and infrastructure management.
- Additional 'nice-to-have' expertise in Python, Databricks, Apache Airflow, Google Cloud Platform (GCP), and Microsoft Azure.
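One of the simplest "coding techniques to improve query performance" mentioned above is adding an index so the planner can seek instead of scan. A database-agnostic sketch using Python's built-in sqlite3 as a stand-in for Aurora/RDS (table and column names are invented for illustration; on Postgres/Aurora you would use EXPLAIN instead of EXPLAIN QUERY PLAN):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INT, total REAL)")
conn.executemany("INSERT INTO orders (customer_id, total) VALUES (?, ?)",
                 [(i % 100, i * 1.5) for i in range(1000)])

def plan(sql):
    # EXPLAIN QUERY PLAN reports whether the engine scans the table or uses an index
    return " ".join(row[-1] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT total FROM orders WHERE customer_id = 42"
before = plan(query)  # without an index: full table scan ("SCAN ...")
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
after = plan(query)   # with the index: seek ("SEARCH ... USING INDEX ...")
```

The same pattern - inspect the plan, add or adjust an index, re-inspect - carries over directly to relational cloud databases.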

Posted 1 week ago

Apply

6.0 - 7.0 years

27 - 35 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Role & responsibilities:
- Provide technical leadership and mentorship to data engineering teams.
- Architect, design, and deploy scalable, secure, and high-performance data pipelines.
- Collaborate with stakeholders, clients, and cross-functional teams to deliver end-to-end data solutions.
- Drive technical strategy and implementation plans in alignment with business needs.
- Oversee project execution using tools like JIRA, ensuring timely delivery and adherence to best practices.
- Implement and maintain CI/CD pipelines and automation tools to streamline development workflows.
- Promote best practices in data engineering and AWS implementations across the team.

Preferred candidate profile:
- Strong hands-on expertise in Python, PySpark, and Spark architecture, including performance tuning and optimization.
- Advanced proficiency in SQL and experience in writing optimized stored procedures.
- In-depth knowledge of the AWS data engineering stack, including AWS Glue, Lambda, API Gateway, EMR, S3, Redshift, and Athena.
- Experience with Infrastructure as Code (IaC) using CloudFormation and Terraform.
- Familiarity with Unix/Linux scripting and system administration is a plus.
- Proven ability to design and deploy robust, production-grade data solutions.
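AWS Glue jobs are typically PySpark scripts, but their core step - map source fields onto a typed target schema and route unparseable rows to an error sink - can be sketched framework-free in plain Python. The mapping, field names, and schema below are invented for illustration, not taken from the posting:

```python
# source field -> (target field, caster); the shape of a Glue ApplyMapping step
MAPPING = {
    "order_id": ("id", int),
    "amt":      ("amount", float),
    "cust":     ("customer", str),
}

def apply_mapping(records):
    """Split raw records into typed rows and rejects."""
    good, bad = [], []
    for rec in records:
        try:
            good.append({tgt: cast(rec[src]) for src, (tgt, cast) in MAPPING.items()})
        except (KeyError, ValueError, TypeError):
            bad.append(rec)  # in a real job these would land in an S3 error prefix
    return good, bad
```

In a production pipeline the same split drives dead-letter handling, so bad input never silently corrupts the warehouse.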

Posted 1 week ago

Apply

3.0 - 5.0 years

10 - 15 Lacs

Mumbai, Aurangabad

Work from Office

Joining: Preferred immediate joiners.

Job Summary: We are seeking a skilled and motivated Data Developer with 3 to 5 years of hands-on experience in designing, developing, and maintaining scalable data solutions. The ideal candidate will work closely with data architects, data analysts, and application developers to build efficient data pipelines, transform data, and support data integration across various platforms.

Key Responsibilities:
- Design, develop, and maintain ETL/ELT pipelines to ingest, transform, and load data from various sources (structured and unstructured).
- Develop and optimize SQL queries, stored procedures, views, and functions for data analysis and reporting.
- Work with data warehousing technologies (e.g., Snowflake, Redshift, BigQuery, Azure Synapse) to support business intelligence solutions.
- Collaborate with data engineers and analysts to implement robust data models and schemas for analytics.
- Ensure data quality, consistency, and accuracy through data validation, testing, and monitoring.
- Implement data security, compliance, and governance protocols in alignment with organizational policies.
- Maintain documentation related to data sources, data flows, and business rules.
- Participate in code reviews, sprint planning, and agile development practices.
Technical Skills Required:
- Languages & Tools: SQL (advanced proficiency required); Python or Scala for data processing; shell scripting (Bash, PowerShell)
- ETL Tools / Data Integration: Apache NiFi, Talend, Informatica, Azure Data Factory, SSIS, or equivalent
- Data Warehousing & Databases: Snowflake, Amazon Redshift, Google BigQuery, Azure Synapse; SQL Server, PostgreSQL, Oracle, or MySQL
- Cloud Platforms (at least one): AWS (Glue, S3, Redshift, Lambda); Azure (ADF, Blob Storage, Synapse); GCP (Dataflow, BigQuery, Cloud Storage)
- Big Data & Streaming (Nice to Have): Apache Spark, Databricks, Kafka, Hadoop ecosystem
- Version Control & DevOps: Git, Bitbucket; CI/CD pipelines (Jenkins, GitHub Actions)

Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- 3-5 years of professional experience as a Data Developer or Data Engineer.
- Strong problem-solving skills and the ability to work both independently and in a team environment.
- Experience working in Agile/Scrum teams is a plus.
- Excellent communication and documentation skills.

Preferred Certifications (Optional):
- Microsoft Certified: Azure Data Engineer Associate
- AWS Certified Data Analytics - Specialty
- Google Cloud Professional Data Engineer
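The ETL/ELT responsibilities listed above all follow the same extract → transform → load shape. A minimal, self-contained sketch using Python's built-in sqlite3 as a stand-in warehouse (the CSV source, table, and column names are invented for illustration):

```python
import csv
import io
import sqlite3

RAW_CSV = "order_id,amount,currency\n1,19.99,usd\n2,5.00,USD\n3,bad,usd\n"

def extract(text):
    # read raw source rows as dicts
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    clean = []
    for r in rows:
        try:
            clean.append((int(r["order_id"]), float(r["amount"]),
                          r["currency"].upper()))  # normalize currency codes
        except ValueError:
            continue  # drop unparseable rows (validation step)
    return clean

def load(conn, rows):
    conn.execute("CREATE TABLE IF NOT EXISTS orders (id INT, amount REAL, currency TEXT)")
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)

conn = sqlite3.connect(":memory:")
load(conn, transform(extract(RAW_CSV)))
total = conn.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
```

Production tools like Azure Data Factory or NiFi orchestrate exactly these stages, adding scheduling, retries, and monitoring around them.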

Posted 1 week ago

Apply

1.0 years

7 - 12 Lacs

Bengaluru, Karnataka, India

On-site

This role is for one of Weekday's clients.
Salary range: Rs 700000 - Rs 1200000 (i.e. INR 7-12 LPA)
Min Experience: 1 year
Location: Bengaluru
JobType: full-time

We are seeking a detail-oriented and analytical Data Analyst with 1-3 years of hands-on experience in data analytics, database management, and reporting. In this role, you will be responsible for collecting, analyzing, and interpreting large datasets to help drive data-informed decisions across the business. Your expertise in SQL and Python will be key to building scalable data pipelines, running ad-hoc queries, generating actionable insights, and automating reporting processes. This is an exciting opportunity to work closely with cross-functional teams such as Product, Engineering, Marketing, and Operations to solve real business problems using data. The ideal candidate is passionate about data, enjoys working with numbers, and is excited to contribute to data-driven strategy and operational excellence.

Key Responsibilities:
- Extract, clean, and analyze structured and unstructured data from various sources using SQL and Python.
- Develop and maintain dashboards, reports, and visualizations to communicate key business metrics.
- Work with stakeholders to define KPIs and create automated data pipelines to track performance over time.
- Translate business requirements into technical specifications and analytical queries.
- Perform exploratory data analysis (EDA) to identify trends, correlations, and outliers.
- Support A/B testing and experimental design by analyzing results and generating insights.
- Collaborate with engineering teams to ensure data integrity, quality, and accessibility.
- Document data definitions, processes, and analysis workflows for transparency and reproducibility.

Required Skills & Qualifications:
- Bachelor's degree in Computer Science, Statistics, Mathematics, Engineering, Economics, or a related field.
- 1-3 years of professional experience in a data analyst or similar role.
- Strong proficiency in writing efficient and optimized SQL queries for data extraction and manipulation.
- Solid experience with Python for data analysis (Pandas, NumPy, Matplotlib/Seaborn, etc.).
- Experience working with relational databases (e.g., MySQL, PostgreSQL, SQL Server).
- Understanding of data wrangling, cleaning, and preprocessing techniques.
- Familiarity with data visualization tools such as Power BI, Tableau, or similar platforms.
- Strong analytical mindset with the ability to translate data into business insights.
- Excellent communication skills with the ability to present findings clearly to technical and non-technical stakeholders.

Good to Have (Not Mandatory):
- Experience with cloud data warehouses (e.g., Snowflake, BigQuery, Redshift).
- Exposure to version control systems like Git.
- Understanding of basic statistics and hypothesis testing.
- Knowledge of APIs and data integration techniques.
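The EDA and outlier-detection work described above often starts with something as simple as a z-score screen. A small stdlib-only sketch (pandas/NumPy would do the same with vectorized operations; the sample data and threshold are invented for illustration):

```python
import statistics

def zscore_outliers(values, threshold=2.5):
    """Return (mean, stdev, outliers) where outliers deviate by > threshold sigmas."""
    mean = statistics.fmean(values)
    sd = statistics.stdev(values)
    outliers = [v for v in values if abs(v - mean) / sd > threshold]
    return mean, sd, outliers

daily_orders = [101, 98, 103, 97, 99, 102, 100, 104, 96, 100, 250]  # 250 is a spike
mean, sd, spikes = zscore_outliers(daily_orders)
```

Note that a single extreme value inflates the standard deviation itself, which is why robust variants (median/MAD-based scores) are often preferred for skewed business metrics.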

Posted 1 week ago

Apply

10.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Acuity Knowledge Partners (Acuity) is a leading provider of bespoke research, analytics and technology solutions to the financial services sector, including asset managers, corporate and investment banks, private equity and venture capital firms, hedge funds and consulting firms. Its global network of over 6,000 analysts and industry experts, combined with proprietary technology, supports more than 600 financial institutions and consulting companies to operate more efficiently and unlock their human capital, driving revenue higher and transforming operations. Acuity is headquartered in London and operates from 10 locations worldwide. The company fosters a diverse, equitable and inclusive work environment, nurturing talent, regardless of race, gender, ethnicity or sexual orientation. Acuity was established as a separate business from Moody's Corporation in 2019, following its acquisition by Equistone Partners Europe (Equistone). In January 2023, funds advised by global private equity firm Permira acquired a majority stake in the business from Equistone, which remains invested as a minority shareholder. For more information, visit www.acuitykp.com

Position Title: Associate Director (Senior Architect - Data)
Department: IT
Location: Gurgaon / Bangalore

Job Summary: The Enterprise Data Architect will enhance the company's strategic use of data by designing, developing, and implementing data models for enterprise applications and systems at the conceptual, logical, business-area, and application layers. This role advocates data modeling methodologies and best practices. We seek a skilled Data Architect with deep knowledge of data architecture principles, extensive data modeling experience, and the ability to create scalable data solutions. Responsibilities include developing and maintaining the enterprise data architecture and ensuring data integrity, interoperability, security, and availability, with a focus on ongoing digital transformation projects.

Key Responsibilities

1. Strategy & Planning
- Develop and deliver long-term strategic goals for the data architecture vision and standards in conjunction with data users, department managers, clients, and other key stakeholders.
- Create short-term tactical solutions to achieve long-term objectives and an overall data management roadmap.
- Establish processes for governing the identification, collection, and use of corporate metadata; take steps to assure metadata accuracy and validity.
- Establish methods and procedures for tracking data quality, completeness, redundancy, and improvement.
- Conduct data capacity planning, life cycle, duration, usage requirements, feasibility studies, and other tasks.
- Create strategies and plans for data security, backup, disaster recovery, business continuity, and archiving.
- Ensure that data strategies and architectures are aligned with regulatory compliance.
- Develop a comprehensive data strategy, in collaboration with different stakeholders, that aligns with the transformational projects' goals.
- Ensure effective data management throughout the project lifecycle.

2. Acquisition & Deployment
- Ensure the success of enterprise-level application rollouts (e.g. ERP, CRM, HCM, FP&A, etc.).
- Liaise with vendors and service providers to select the products or services that best meet company goals.

3. Operational Management
- Assess and determine governance, stewardship, and frameworks for managing data across the organization.
- Develop and promote data management methodologies and standards.
- Document information products from business processes and create data entities.
- Create entity relationship diagrams to show the digital thread across the value streams and enterprise.
- Normalize data across all systems and databases to ensure a common definition of data entities across the enterprise.
- Document enterprise reporting needs and develop the data strategy to enable a single source of truth for all reporting data.
- Address the regulatory compliance requirements of each country and ensure our data is secure and compliant.
- Select and implement the appropriate tools, software, applications, and systems to support data technology goals.
- Oversee the mapping of data sources, data movement, interfaces, and analytics, with the goal of ensuring data quality.
- Collaborate with project managers and business unit leaders for all projects involving enterprise data.
- Address data-related problems regarding systems integration, compatibility, and multiple-platform integration.
- Act as a leader and advocate of data management, including coaching, training, and career development for staff.
- Develop and implement key components as needed to create testing criteria that guarantee the fidelity and performance of the data architecture.
- Document the data architecture and environment to maintain a current and accurate view of the larger data picture.
- Identify and develop opportunities for data reuse, migration, or retirement.

4. Data Architecture Design
- Develop and maintain the enterprise data architecture, including data models, databases, data warehouses, and data lakes.
- Design and implement scalable, high-performance data solutions that meet business requirements.

5. Data Governance
- Establish and enforce data governance policies and procedures as agreed with stakeholders.
- Maintain data integrity, quality, and security within Finance, HR, and other such enterprise systems.

6. Data Migration
- Oversee the data migration process from legacy systems to the new systems being put in place.
- Define and manage data mappings, cleansing, transformation, and validation to ensure accuracy and completeness.

7. Master Data Management
- Devise processes to manage master data (e.g., customer, vendor, product information) to ensure consistency and accuracy across enterprise systems and business processes.
- Provide data management (create, update, and delimit) methods to ensure master data is governed.

8. Stakeholder Collaboration
- Collaborate with various stakeholders, including business users and other system vendors, to understand data requirements.
- Ensure the enterprise system meets the organization's data needs.

9. Training and Support
- Provide training and support to end users on data entry, retrieval, and reporting within the candidate enterprise systems.
- Promote user adoption and proper use of data.

10. Data Quality Assurance
- Implement data quality assurance measures to identify and correct data issues.
- Ensure Oracle Fusion and other enterprise systems contain reliable and up-to-date information.

11. Reporting and Analytics
- Facilitate the development of reporting and analytics capabilities within Oracle Fusion and other systems.
- Enable data-driven decision-making through robust data analysis.

12. Continuous Improvement
- Continuously monitor and improve data processes and the data capabilities of Oracle Fusion and other systems.
- Leverage new technologies for enhanced data management to support evolving business needs.

Technology and Tools:
- Oracle Fusion Cloud
- Data modeling tools (e.g., ER/Studio, ERwin)
- ETL tools (e.g., Informatica, Talend, Azure Data Factory)
- Data pipelines: understanding of data pipeline tools like Apache Airflow and AWS Glue
- Database management systems: Oracle Database, MySQL, SQL Server, PostgreSQL, MongoDB, Cassandra, Couchbase, Redis, Hadoop, Apache Spark, Amazon RDS, Google BigQuery, Microsoft Azure SQL Database, Neo4j, OrientDB, Memcached
- Data governance tools (e.g., Collibra, Informatica Axon, Oracle EDM, Oracle MDM)
- Reporting and analytics tools (e.g., Oracle Analytics Cloud, Power BI, Tableau, Oracle BIP)
- Hyperscalers / cloud platforms (e.g., AWS, Azure)
- Big data technologies such as Hadoop, HDFS, MapReduce, and Spark
- Cloud platforms such as Amazon Web Services (including RDS, Redshift, and S3), Microsoft Azure services like Azure SQL Database and Cosmos DB, and Google Cloud Platform services such as BigQuery and Cloud Storage
- Programming languages (e.g., Java, J2EE, EJB, .NET, WebSphere, etc.)
- SQL: strong SQL skills for querying and managing databases
- Python: proficiency in Python for data manipulation and analysis
- Java: knowledge of Java for building data-driven applications
- Data security and protocols: understanding of data security protocols and compliance standards

Key Competencies

Educational Qualification: Bachelor's degree in Computer Science, Information Technology, or a related field; Master's degree preferred.

Experience:
- 10+ years overall and at least 7 years of experience in data architecture, data modeling, and database design.
- Proven experience with data warehousing, data lakes, and big data technologies.
- Expertise in SQL and experience with NoSQL databases.
- Experience with cloud platforms (e.g., AWS, Azure) and related data services.
- Experience with Oracle Fusion or similar ERP systems is highly desirable.

Skills:
- Strong understanding of data governance and data security best practices.
- Excellent problem-solving and analytical skills.
- Strong communication and interpersonal skills.
- Ability to work effectively in a collaborative team environment.
- Leadership experience with a track record of mentoring and developing team members.
- Excellent documentation and presentation skills.
- Good knowledge of applicable data privacy practices and laws.

Certifications: Relevant certifications (e.g., Certified Data Management Professional, AWS Certified Big Data - Specialty) are a plus.

Posted 1 week ago

Apply

12.0 years

1 - 3 Lacs

Hyderābād

Remote

Overview: As Data Modelling Assoc Manager, you will be the key technical expert overseeing data modeling and will drive a strong vision for how data modelling can proactively create a positive impact on the business. You'll be empowered to create and lead a strong team of data modelers who create data models for deployment in the Data Foundation layer, ingest data from various source systems, rest data on the PepsiCo Data Lake, and enable exploration and access for analytics, visualization, machine learning, and product development efforts across the company. As a member of the data modelling team, you will create data models for very large and complex data applications in public cloud environments, directly impacting the design, architecture, and implementation of PepsiCo's flagship data products around topics like revenue management, supply chain, manufacturing, and logistics. You will independently analyze project data needs, identify data storage and integration needs/issues, and drive opportunities for data model reuse, satisfying project requirements. The role will advocate Enterprise Architecture, Data Design, and D&A standards and best practices. You will be a key technical expert performing all aspects of data modelling, working closely with the Data Governance, Data Engineering, and Data Architecture teams, and you will provide technical guidance to junior members of the team as and when needed. The primary responsibility of this role is to work with data product owners, data management owners, and data engineering teams to create physical and logical data models with an extensible philosophy to support future, unknown use cases with minimal rework. You'll be working in a hybrid environment with in-house, on-premises data sources as well as cloud and remote systems. You will establish data design patterns that will drive flexible, scalable, and efficient data models to maximize value and reuse.
Responsibilities:
- Independently complete conceptual, logical, and physical data models for any supported platform, including SQL Data Warehouse, EMR, Spark, Databricks, Snowflake, Azure Synapse, or other cloud data warehousing technologies.
- Govern data design/modeling: documentation of metadata (business definitions of entities and attributes) and construction of database objects, for baseline and investment-funded projects, as assigned.
- Provide and/or support data analysis, requirements gathering, solution development, and design reviews for enhancements to, or new, applications/reporting.
- Support assigned project contractors (both on- and off-shore), orienting new contractors to standards, best practices, and tools.
- Advocate existing Enterprise Data Design standards; assist in establishing and documenting new standards.
- Contribute to project cost estimates, working with senior members of the team to evaluate the size and complexity of the changes or new development.
- Ensure physical and logical data models are designed with an extensible philosophy to support future, unknown use cases with minimal rework.
- Develop a deep understanding of the business domain and enterprise technology inventory to craft a solution roadmap that achieves business objectives and maximizes reuse.
- Partner with IT, data engineering, and other teams to ensure the enterprise data model incorporates the key dimensions needed for proper management: business and financial policies, security, local-market regulatory rules, and consumer privacy-by-design principles (PII management), all linked across fundamental identity foundations.
- Drive collaborative reviews of design, code, data, and security-feature implementations performed by data engineers to drive data product development.
- Assist with data planning, sourcing, collection, profiling, and transformation.
- Create source-to-target mappings for ETL and BI developers.
- Show expertise for data at all levels: low-latency, relational, and unstructured data stores; analytical stores and data lakes; data streaming (consumption/production); and data in transit.
- Develop reusable data models based on cloud-centric, code-first approaches to data management and cleansing.
- Partner with the data science team to standardize their classification of unstructured data into standard structures for data discovery and action by business customers and stakeholders.
- Support data lineage and mapping of source system data to canonical data stores for research, analysis, and productization.

Qualifications:
- 12+ years of overall technology experience, including at least 6+ years of data modelling and systems architecture.
- 6+ years of experience with data lake infrastructure, data warehousing, and data analytics tools.
- 6+ years of experience developing enterprise data models.
- 6+ years of cloud data engineering experience in at least one cloud (Azure, AWS, GCP).
- 6+ years of experience building solutions in the retail or supply chain space.
- Expertise in data modelling tools (ER/Studio, Erwin, IDM/ARDM models).
- Fluent with Azure cloud services; Azure certification is a plus.
- Experience scaling and managing a team of 5+ data modelers.
- Experience with integration of multi-cloud services with on-premises technologies.
- Experience with data profiling and data quality tools like Apache Griffin, Deequ, and Great Expectations.
- Experience with at least one MPP database technology such as Redshift, Synapse, Teradata, or Snowflake.
- Experience with version control systems like GitHub and deployment & CI tools.
- Experience with Azure Data Factory, Databricks, and Azure Machine Learning is a plus.
- Experience with metadata management, data lineage, and data glossaries is a plus.
- Working knowledge of agile development, including DevOps and DataOps concepts.
- Familiarity with business intelligence tools (such as Power BI).
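The physical models described above are commonly star schemas: a fact table holding measures plus surrogate keys into conformed dimensions. A toy illustration in Python (the entities and data are invented; real models would be maintained as DDL in tools like ER/Studio or Erwin):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class DimProduct:        # conformed dimension: one row per product
    product_key: int
    name: str
    category: str

@dataclass(frozen=True)
class FactSale:          # fact: measures + surrogate keys into dimensions
    date_key: int
    product_key: int
    units: int
    revenue: float

products = {1: DimProduct(1, "Cola 500ml", "Beverages"),
            2: DimProduct(2, "Chips 50g", "Snacks")}
sales = [FactSale(20240601, 1, 10, 250.0),
         FactSale(20240601, 2, 4, 80.0),
         FactSale(20240602, 1, 6, 150.0)]

# a typical star-schema query: aggregate a fact measure by a dimension attribute
revenue_by_category = {}
for s in sales:
    cat = products[s.product_key].category
    revenue_by_category[cat] = revenue_by_category.get(cat, 0.0) + s.revenue
```

Keeping dimensions conformed (one product table shared by all facts) is what lets revenue, inventory, and logistics facts be analyzed along the same attributes with minimal rework.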
- Excellent communication skills, both verbal and written, along with the ability to influence and demonstrate confidence in communications with senior-level management.
- Proven track record of leading, mentoring, hiring, and scaling data teams.
- Strong change manager: comfortable with change, especially that which arises through company growth.
- Ability to understand and translate business requirements into data and technical requirements.
- High degree of organization and ability to manage multiple, competing projects and priorities simultaneously.
- Positive and flexible attitude to enable adjusting to different needs in an ever-changing environment.
- Strong leadership, organizational, and interpersonal skills; comfortable managing trade-offs.
- Fosters a team culture of accountability, communication, and self-management.
- Proactively drives impact and engagement while bringing others along.
- Consistently attains/exceeds individual and team goals.
- Ability to lead others without direct authority in a matrixed environment.

Differentiating Competencies Required:
- Ability to work with virtual teams (remote work locations); lead a team of technical resources (employees and contractors) based in multiple locations across geographies.
- Lead technical discussions, driving clarity of complex issues/requirements to build robust solutions.
- Strong communication skills to meet with the business, understand sometimes ambiguous needs, and translate them into clear, aligned requirements.
- Able to work independently with business partners to understand requirements quickly, perform analysis, and lead design review sessions.
- Highly influential, with the ability to educate challenging stakeholders on the role of data and its purpose in the business.
- Places the user at the centre of decision making.
- Teams up and collaborates for speed, agility, and innovation.
- Experience with, and embrace of, agile methodologies.
- Strong negotiation and decision-making skills.
- Experience managing and working with globally distributed teams.

Posted 1 week ago

Apply

5.0 years

6 - 8 Lacs

Hyderābād

Remote

DESCRIPTION
Want to join Earth's most customer-centric company? Do you like to dive deep to understand problems? Are you someone who likes to challenge the status quo? Do you strive to excel at the goals assigned to you? If yes, we have opportunities for you. Global Operations - Artificial Intelligence (GO-AI) at Amazon is looking to hire candidates who can excel in a fast-paced, dynamic environment. Are you somebody who likes to use and analyze big data to drive business decisions? Do you enjoy converting data into insights that will be used to enhance customer decisions worldwide for business leaders? Do you want to be part of the data team which measures the pulse of innovative machine-vision-based projects? If your answer is yes, join our team. GO-AI is looking for a motivated individual with strong skills and experience in resource utilization planning, process optimization, and execution of scalable and robust operational mechanisms to join the GO-AI Ops DnA team. In this position you will be responsible for supporting our sites to build solutions for the rapidly expanding GO-AI team. The role requires the ability to work with a variety of key stakeholders across job functions and multiple sites. We are looking for an entrepreneurial and analytical program manager who is passionate about their work, understands how to manage service levels across multiple skills/programs, and is willing to move fast and experiment often.

Key job responsibilities
- Ability to maintain and refine straightforward ETL and write secure, stable, testable, maintainable code with minimal defects, and automate manual processes.
- Proficiency in one or more industry analytics visualization tools (e.g. Excel, Tableau/QuickSight/Power BI) and, as needed, statistical methods (e.g. t-test, chi-squared) to deliver actionable insights to stakeholders.
Building and owning small to mid-size BI solutions with high accuracy and on-time delivery, using data sets, queries, reports, dashboards, analyses, or components of larger solutions to answer straightforward business questions with data, incorporating business intelligence best practices, data management fundamentals, and analysis principles. Good understanding of the relevant data lineage: including sources of data; how metrics are aggregated; and how the resulting business intelligence is consumed, interpreted, and acted upon by the business, where the end product enables effective, data-driven business decisions. Taking responsibility for the code, queries, reports, and analyses that are inherited or produced, and having analyses and code reviewed periodically. Effective partnering with peer BIEs and others in your team to troubleshoot, research root causes, and propose solutions, either taking ownership of their resolution or ensuring a clear hand-off to the right owner. About the team The Global Operations – Artificial Intelligence (GO-AI) team is an initiative that remotely handles exceptions in Amazon Robotic Fulfillment Centers globally. GO-AI seeks to complement automated vision-based decision-making technologies by providing remote human support for the subset of tasks which require higher cognitive ability and cannot be processed through automated decision making with high confidence. This team provides end-to-end solutions through inbuilt competencies of Operations and strong central specialized teams to deliver programs at Amazon scale. It operates multiple programs and other new initiatives in partnership with global technology and operations teams. BASIC QUALIFICATIONS 5+ years of experience analyzing and interpreting data with Redshift, Oracle, NoSQL, etc.
Experience with data visualization using Tableau, Quicksight, or similar tools Experience with data modeling, warehousing, and building ETL pipelines Experience with statistical analysis packages such as R, SAS, and MATLAB Experience using SQL to pull data from a database or data warehouse and scripting experience (Python) to process data for modeling PREFERRED QUALIFICATIONS Experience with AWS solutions such as EC2, DynamoDB, S3, and Redshift Experience in data mining, ETL, etc., and using databases in a business environment with large-scale, complex datasets Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner. Job details IND, TS, Hyderabad Corporate Operations
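The qualifications above pair SQL data pulls with Python scripting to process data. A minimal sketch of that workflow, using Python's built-in sqlite3 as a stand-in for a Redshift connection (in practice a driver such as redshift_connector would be used, and the table and column names here are hypothetical):

```python
import sqlite3

# Hypothetical stand-in for a Redshift connection: the SQL-pull-then-process
# pattern is the same; only the driver would differ in a real warehouse.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE exceptions (site TEXT, task_type TEXT, resolved INTEGER);
    INSERT INTO exceptions VALUES
        ('HYD1', 'vision_check', 1),
        ('HYD1', 'vision_check', 0),
        ('BLR2', 'label_review', 1);
""")

# Pull aggregated data with SQL, then finish the shaping in Python.
rows = conn.execute("""
    SELECT site, COUNT(*) AS total, SUM(resolved) AS resolved
    FROM exceptions
    GROUP BY site
    ORDER BY site
""").fetchall()

# Per-site resolution rate, ready for a dashboard or stakeholder report.
resolution_rate = {site: resolved / total for site, total, resolved in rows}
```

Pushing the aggregation into SQL and keeping only the final shaping in Python is what keeps this pattern workable on warehouse-scale data.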

Posted 1 week ago

Apply

8.0 - 10.0 years

6 - 8 Lacs

Hyderābād

On-site

Join Amgen’s Mission of Serving Patients At Amgen, if you feel like you’re part of something bigger, it’s because you are. Our shared mission—to serve patients living with serious illnesses—drives all that we do. Since 1980, we’ve helped pioneer the world of biotech in our fight against the world’s toughest diseases. With our focus on four therapeutic areas – Oncology, Inflammation, General Medicine, and Rare Disease – we reach millions of patients each year. As a member of the Amgen team, you’ll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science-based. If you have a passion for challenges and the opportunities that lie within them, you’ll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career. Principal IS Bus Sys Analyst, Neural Nexus What you will do Let’s do this. Let’s change the world. In this vital role you will support the delivery of emerging AI/ML capabilities within the Commercial organization as a leader in Amgen's Neural Nexus program. We seek a technology leader with a passion for innovation and a collaborative working style that partners effectively with business and technology leaders. Are you interested in building a team that consistently delivers business value in an agile model using technologies such as AWS, Databricks, Airflow, and Tableau? Come join our team! Roles & Responsibilities: Establish an effective engagement model to collaborate with the Commercial Data & Analytics (CD&A) team to help realize business value through the application of commercial data and emerging AI/ML technologies. Serve as the technology product owner for the launch and growth of the Neural Nexus product teams focused on data connectivity, predictive modeling, and fast-cycle value delivery for commercial teams. 
Lead and mentor junior team members to deliver on the needs of the business. Interact with business clients and technology management to create technology roadmaps, build business cases, and drive DevOps to achieve the roadmaps. Help mature Agile operating principles through deployment of creative and consistent practices for user story development, robust testing and quality oversight, and a focus on user experience. Become the subject matter expert in emerging technology capabilities by researching and implementing new tools, features, and methodologies, both internal and external. Build domain expertise in a wide variety of Commercial data domains. Provide input for governance discussions and help prepare materials to support executive alignment on technology strategy and investment. What we expect of you We are all different, yet we all use our unique contributions to serve patients. Basic Qualifications: Master’s degree with 8-10 years of Information Systems experience OR Bachelor’s degree with 10-14 years of Information Systems experience OR Diploma with 14-18 years of Information Systems experience. Excellent problem-solving skills and a passion for tackling complex challenges in data and analytics with technology. Experience leading data and analytics teams in a Scaled Agile Framework (SAFe). Good interpersonal skills, good attention to detail, and the ability to influence based on data and business value. Ability to build compelling business cases with accurate cost and effort estimations. Experience writing user requirements and acceptance criteria in agile project management systems such as Jira. Ability to explain sophisticated technical concepts to non-technical clients. Good understanding of sales and incentive compensation value streams. Technical Skills: ETL tools such as Databricks, Redshift, or equivalent cloud-based databases; Big Data, analytics, reporting, data lake, and data integration
technologies; S3 or an equivalent storage system; AWS (or similar cloud-based platforms); BI tools (Tableau and Power BI preferred). Preferred Qualifications: Jira Align & Confluence experience. Experience with DevOps, Continuous Integration, and Continuous Delivery methodology. Understanding of software systems strategy, governance, and infrastructure. Experience managing product features for PI planning and developing product roadmaps and user journeys. Familiarity with low-code/no-code test automation software. Technical thought leadership. Soft Skills: Able to work effectively across multiple geographies (primarily India, Portugal, and the United States) under minimal supervision. Demonstrated proficiency in written and verbal communication in the English language. Skilled in providing oversight and mentoring team members; demonstrated ability to delegate work effectively. Intellectual curiosity and the ability to question partners across functions. Ability to prioritize successfully based on business value. High degree of initiative and self-motivation. Ability to manage multiple priorities successfully across virtual teams. Team-oriented, with a focus on achieving team goals. Strong presentation and public speaking skills. What you can expect of us As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we’ll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now and make a lasting impact with the Amgen team. 
careers.amgen.com As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.

Posted 1 week ago

Apply

5.0 years

6 - 20 Lacs

India

On-site

Job Description: Senior Database Developer (MySQL & AWS Expert) Location: Hyderabad, India Experience: 5+ Years (Preferably 7+ Years) Employment Type: Full-time Role Overview: We are looking for an exceptionally strong Database Developer with 5+ years of hands-on experience specializing in MySQL database development on Amazon AWS Cloud. The ideal candidate should have deep expertise in high-performance query tuning, handling massive datasets, designing complex summary tables, and implementing scalable database architectures. This role demands a highly analytical and problem-solving mindset, capable of delivering optimized and mission-critical database solutions. Key Responsibilities: • Design, develop, and optimize highly scalable MySQL databases on AWS cloud infrastructure. • Expert-level performance tuning of queries, indexes, and stored procedures for mission-critical applications. • Handle large-scale datasets, ensuring efficient query execution and minimal latency. • Architect and implement summary tables for optimized reporting and analytical performance. • Work closely with software engineers to design efficient data models, indexing strategies, and partitioning techniques. • Ensure high availability, disaster recovery, and fault tolerance of database systems. • Perform root-cause analysis of database bottlenecks and implement robust solutions. • Implement advanced replication strategies, read/write separation, and data sharding for optimal performance. • Work with DevOps teams to automate database monitoring, backups, and performance metrics using AWS tools. • Optimize stored procedures, triggers, and complex database functions to enhance system efficiency. • Ensure best-in-class data security, encryption, and access control policies. Must-Have Skills: • Proven expertise in MySQL query optimization, indexing, and execution plan analysis. • Strong knowledge of AWS RDS, Aurora, and cloud-native database services. 
• Hands-on experience in tuning high-performance, high-volume transactional databases. • Deep understanding of database partitioning, sharding, caching, and replication strategies. • Experience working with large-scale datasets (millions to billions of records) and ensuring low-latency queries. • Advanced experience in database schema design, normalization, and optimization for high availability. • Proficiency in query profiling, memory management, and database load balancing. • Strong understanding of data warehousing, ETL processes, and analytics-driven data models. • Expertise in troubleshooting slow queries and deadlocks in a production environment. • Proficiency in scripting languages like Python, Shell, or SQL scripting for automation. Preferred Skills: • Experience with big data technologies like Redshift, Snowflake, Hadoop, or Spark. • Exposure to NoSQL databases (MongoDB, Redis) for hybrid data architectures. • Hands-on experience with CI/CD pipelines and DevOps database management. • Experience in predictive analytics and AI-driven data optimizations. Educational Qualification: • Bachelor's or Master’s degree in Computer Science, Information Technology, or a related field. Salary & Benefits: • Top-tier compensation package for highly skilled candidates. • Fast-track career growth with opportunities for leadership roles. • Comprehensive health benefits and performance-based bonuses. • Exposure to cutting-edge technologies and large-scale data challenges. If you are a world-class MySQL expert with a passion for solving complex database challenges and optimizing large-scale systems, apply now! Job Types: Full-time, Permanent Pay: ₹634,321.11 - ₹2,091,956.36 per year Benefits: Health insurance Paid sick time Paid time off Provident Fund Schedule: Day shift Monday to Friday Language: English (Required) Work Location: In person Expected Start Date: 21/07/2025
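The summary tables this posting calls for are pre-aggregated rollups that reporting queries read instead of scanning the full fact table. A minimal sketch of the pattern, using Python's sqlite3 in place of MySQL on AWS (all table and column names are illustrative):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (order_day TEXT, amount REAL);
    INSERT INTO orders VALUES
        ('2025-07-01', 100.0), ('2025-07-01', 50.0), ('2025-07-02', 75.0);

    -- Pre-aggregated summary table: reports read this small table instead of
    -- scanning a potentially billions-of-rows fact table on every query.
    CREATE TABLE daily_sales_summary (
        order_day    TEXT PRIMARY KEY,
        order_count  INTEGER,
        total_amount REAL
    );
""")

# Refresh step; in MySQL this might instead be INSERT ... ON DUPLICATE KEY
# UPDATE run incrementally from a scheduler so only recent days are rebuilt.
conn.execute("""
    INSERT INTO daily_sales_summary
    SELECT order_day, COUNT(*), SUM(amount)
    FROM orders
    GROUP BY order_day
""")

summary = conn.execute(
    "SELECT order_day, total_amount FROM daily_sales_summary ORDER BY order_day"
).fetchall()
```

The trade-off is staleness versus read latency: the summary is only as fresh as its last refresh, which is why incremental refresh strategies matter at billion-row scale.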

Posted 1 week ago

Apply

3.0 years

4 - 6 Lacs

Hyderābād

On-site

DESCRIPTION As a Data Engineer on the Data and AI team, you will design and implement robust data pipelines and infrastructure that power our organization's data-driven decisions and AI capabilities. This role is critical in developing and maintaining our enterprise-scale data processing systems that handle high-volume transactions while ensuring data security, privacy compliance, and optimal performance. You'll be part of a dynamic team that designs and implements comprehensive data solutions, from real-time processing architectures to secure storage solutions and privacy-compliant data access layers. The role involves close collaboration with cross-functional teams, including software development engineers, product managers, and scientists, to create data products that power critical business capabilities. You'll have the opportunity to work with leading technologies in cloud computing, big data processing, and machine learning infrastructure, while contributing to the development of robust data governance frameworks. If you're passionate about solving complex technical challenges in high-scale environments, thrive in a collaborative team setting, and want to make a lasting impact on our organization's data infrastructure, this role offers an exciting opportunity to shape the future of our data and AI capabilities. Key job responsibilities Design and implement ETL/ELT frameworks that handle large-scale data operations, while building reusable components for data ingestion, transformation, and orchestration while ensuring data quality and reliability. Establish and maintain robust data governance standards by implementing comprehensive security controls, access management frameworks, and privacy-compliant architectures that safeguard sensitive information. Drive the implementation of data solutions, both real-time and batch, optimizing them for both analytical workloads and AI/ML applications. 
Lead technical design reviews and provide mentorship on data engineering best practices, identifying opportunities for architectural improvements and guiding the implementation of enhanced solutions. Build data quality frameworks with robust monitoring systems and validation processes to ensure data accuracy and reliability throughout the data lifecycle. Drive continuous improvement initiatives by evaluating and implementing new technologies and methodologies that enhance data infrastructure capabilities and operational efficiency. A day in the life The day often begins with a team stand-up to align priorities, followed by a review of data pipeline monitoring alarms to address any processing issues and ensure data quality standards are maintained across systems. Throughout the day, you'll find yourself immersed in various technical tasks, including developing and optimizing ETL/ELT processes, implementing data governance controls, and reviewing code for data processing systems. You'll work closely with software engineers, scientists, and product managers, participating in technical design discussions and sharing your expertise in data architecture and engineering best practices. Your responsibilities extend to communicating with non-technical stakeholders, explaining data-related projects and their business impact. You'll also mentor junior engineers and contribute to maintaining comprehensive technical documentation. You'll troubleshoot issues that arise in the data infrastructure, optimize the performance of data pipelines, and ensure data security and compliance with relevant regulations. Staying updated on the latest data engineering technologies and best practices is crucial, as you'll be expected to incorporate new learnings into your work. By the end of a typical day, you'll have advanced key data infrastructure initiatives, solved complex technical challenges, and improved the reliability, efficiency, and security of data systems. 
Whether it's implementing new data governance controls, optimizing data processing workflows, or enhancing data platforms to support new AI models, your work directly impacts the organization's ability to leverage data for critical business decisions and AI capabilities. If you are not sure that every qualification on the list above describes you exactly, we'd still love to hear from you! At Amazon, we value people with unique backgrounds, experiences, and skillsets. If you’re passionate about this role and want to make an impact on a global scale, please apply! About the team The Data and Artificial Intelligence (AI) team is a new function within Customer Engagement Technology. We own the end-to-end process of defining, building, implementing, and monitoring a comprehensive data strategy. We also develop and apply Generative Artificial Intelligence (GenAI), Machine Learning (ML), Ontology, and Natural Language Processing (NLP) to customer and associate experiences. BASIC QUALIFICATIONS 3+ years of data engineering experience Bachelor’s degree in Computer Science, Engineering, or a related technical discipline PREFERRED QUALIFICATIONS Experience with AWS data services (Redshift, S3, Glue, EMR, Kinesis, Lambda, RDS) and understanding of IAM security frameworks Proficiency in designing and implementing logical data models that drive physical designs Hands-on experience working with large language models, including understanding of data infrastructure requirements for AI model training Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner. 
Job details IND, TS, Hyderabad Database Administration
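The ETL/ELT frameworks with data-quality validation described in this posting can be sketched at their smallest as three stage functions with a quarantine path for bad records. This is a hedged illustration with an in-memory source and sink; a real pipeline would read from S3 or Kinesis and write to Redshift, but the stage boundaries are the same:

```python
# Each stage is a separate, testable function with a data-quality gate in the
# transform step. All record fields here are hypothetical examples.
def extract():
    # Raw records as they might arrive from an upstream source.
    return [
        {"order_id": "A1", "amount": "19.99"},
        {"order_id": "A2", "amount": "bad-value"},
        {"order_id": "A3", "amount": "5.00"},
    ]

def transform(records):
    # Quarantine malformed rows instead of silently dropping them, so data
    # quality issues stay visible and auditable.
    clean, rejected = [], []
    for r in records:
        try:
            clean.append({"order_id": r["order_id"], "amount": float(r["amount"])})
        except ValueError:
            rejected.append(r)
    return clean, rejected

def load(records, sink):
    # Stand-in for a warehouse write; returns the row count for monitoring.
    sink.extend(records)
    return len(records)

warehouse = []
clean, rejected = transform(extract())
loaded = load(clean, warehouse)
```

Keeping the rejected records, rather than discarding them, is what enables the monitoring and root-cause analysis responsibilities the posting describes.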

Posted 1 week ago

Apply

2.0 years

6 - 8 Lacs

Hyderābād

On-site

DESCRIPTION Transportation Financial Systems (TFS) owns the technology components that perform the financial activities for the transportation business. These systems are used across all transportation programs and retail expansion to new geographies. TFS systems provide financial document creation & management, expense auditing, accounting, payments, and cost allocation functions. Our new generation products are highly scalable and operate at finer-level granularity to reconcile every dollar in transportation financial accounts with zero manual entries or corrections. The goal is to develop a global product suite for all freight modes touching every single package movement across Amazon. Our mission is to abstract logistics complexities from the financial world and financial complexities from the logistics world. We are looking for an innovative, hands-on, and customer-obsessed candidate for this role. The candidate must be detail oriented, have superior verbal and written communication skills, and be able to juggle multiple tasks at once. The candidate must be able to make sound judgments and get the right things done. We seek a Business Intelligence (BI) Engineer to strengthen our data-driven decision-making processes. This role requires an individual with excellent statistical and analytical abilities, deep knowledge of business intelligence solutions, the ability to leverage GenAI technologies to analyze and solve problems, and the ability to collaborate with product, business, and tech teams. The successful candidate will demonstrate the ability to work independently and learn quickly, comprehend Transportation Finance system functions rapidly, have a passion for data and analytics, be a self-starter comfortable with ambiguity, be able to work in a fast-paced and entrepreneurial environment, and be driven by a desire to innovate in Amazon’s approach to this space. 
Key job responsibilities 1) Translate business problems into analytical requirements and define the expected output 2) Develop and implement key performance indicators (KPIs) to measure business performance and product impact; take responsibility for deep-dive analysis on key metrics 3) Create and execute an analytical approach to solve the problem, in line with stakeholder expectations 4) Leverage GenAI technologies to solve problems and build solutions 5) Be the domain expert, with knowledge of data availability from various sources 6) Execute solutions with scalable development practices in scripting; write and optimize SQL queries, reporting, data extraction, and data visualization 7) Proactively and independently work with stakeholders to construct use cases and associated standardized outputs for your work 8) Actively manage the timeline and deliverables of projects, focusing on interactions in the team About the team Transportation Financial Systems (TFS) owns the technology components that perform the financial activities for the transportation business, as described above. BASIC QUALIFICATIONS 2+ years of experience analyzing and interpreting data with Redshift, Oracle, NoSQL, etc. Experience with data visualization using Tableau, Quicksight, or similar tools Experience with one or more industry analytics visualization tools (e.g. 
Excel, Tableau, QuickSight, MicroStrategy, PowerBI) and statistical methods (e.g. t-test, Chi-squared) Experience with a scripting language (e.g., Python, Java, or R) PREFERRED QUALIFICATIONS Master's degree or an advanced technical degree Knowledge of data modeling and data pipeline design Experience with statistical analysis and correlation analysis Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner. Job details IND, TS, Hyderabad Software Development
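The t-test named in the qualifications can be illustrated with a two-sample t-statistic (pooled variance) computed from first principles using only the standard library. The sample data below is purely illustrative, not from any real dataset:

```python
import math
import statistics

def pooled_t_statistic(a, b):
    # Two-sample t-statistic assuming equal variances: pool the sample
    # variances, then scale the mean difference by the standard error.
    na, nb = len(a), len(b)
    va, vb = statistics.variance(a), statistics.variance(b)  # sample variances
    pooled = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)
    se = math.sqrt(pooled * (1 / na + 1 / nb))
    return (statistics.mean(a) - statistics.mean(b)) / se

control = [1.0, 2.0, 3.0]
treatment = [4.0, 5.0, 6.0]
t = pooled_t_statistic(control, treatment)  # large |t| suggests a real shift
```

In practice a BI engineer would pair the statistic with degrees of freedom and a p-value (e.g. via scipy.stats.ttest_ind) before declaring that a metric moved.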

Posted 1 week ago

Apply

3.0 years

6 - 9 Lacs

Hyderābād

On-site

Data Analytics Engineer – CL3 Role Overview: As a Data Analytics Engineer, you will actively engage in your engineering craft, taking a hands-on approach to multiple high-visibility projects. Your expertise will be pivotal in delivering solutions that delight customers and users, while also driving tangible value for Deloitte's business investments. You will leverage your extensive engineering craftsmanship across multiple programming languages and modern frameworks, consistently demonstrating your strong track record in delivering high-quality, outcome-focused solutions. The ideal candidate will be a dependable team player, collaborating with cross-functional teams to design, develop, and deploy advanced software solutions. Key Responsibilities: Outcome-Driven Accountability: Embrace and drive a culture of accountability for customer and business outcomes. Develop engineering solutions that solve complex problems with valuable outcomes, ensuring high-quality, lean designs and implementations. Technical Leadership and Advocacy: Serve as the technical advocate for products, ensuring code integrity, feasibility, and alignment with business and customer goals. Lead requirement analysis, component design, development, unit testing, integrations, and support. Engineering Craftsmanship: Maintain accountability for the integrity of code design, implementation, quality, data, and ongoing maintenance and operations. Stay hands-on, self-driven, and continuously learn new approaches, languages, and frameworks. Create technical specifications, and write high-quality, supportable, scalable code ensuring all quality KPIs are met or exceeded. Demonstrate collaborative skills to work effectively with diverse teams. Customer-Centric Engineering: Develop lean engineering solutions through rapid, inexpensive experimentation to solve customer needs. Engage with customers and product teams before, during, and after delivery to ensure the right solution is delivered at the right time. 
Incremental and Iterative Delivery: Adopt a mindset that favors action and evidence over extensive planning. Utilize a learning-forward approach to navigate complexity and uncertainty, delivering lean, supportable, and maintainable solutions. Cross-Functional Collaboration and Integration: Work collaboratively with empowered, cross-functional teams including product management, experience, and delivery. Integrate diverse perspectives to make well-informed decisions that balance feasibility, viability, usability, and value. Foster a collaborative environment that enhances team synergy and innovation. Advanced Technical Proficiency: Possess deep expertise in modern software engineering practices and principles, including Agile methodologies and DevSecOps to deliver daily product deployments using full automation from code check-in to production with all quality checks through SDLC lifecycle. Strive to be a role model, leveraging these techniques to optimize solutioning and product delivery. Demonstrate understanding of the full lifecycle product development, focusing on continuous improvement, and learning. Domain Expertise: Quickly acquire domain-specific knowledge relevant to the business or product. Translate business/user needs, architectures, and data designs into technical specifications and code. Be a valuable, flexible, and dedicated team member, supportive of teammates, and focused on quality and tech debt payoff. Effective Communication and Influence: Exhibit exceptional communication skills, capable of articulating complex technical concepts clearly and compellingly. Inspire and influence teammates and product teams through well-structured arguments and trade-offs supported by evidence. Create coherent narratives that align technical solutions with business objectives. Engagement and Collaborative Co-Creation: Engage and collaborate with product engineering teams at all organizational levels, including customers as needed. 
Build and maintain constructive relationships, fostering a culture of co-creation and shared momentum towards achieving product goals. Align diverse perspectives and drive consensus to create feasible solutions. The team : US Deloitte Technology Product Engineering has modernized software and product delivery, creating a scalable, cost-effective model that focuses on value/outcomes that leverages a progressive and responsive talent structure. As Deloitte’s primary internal development team, Product Engineering delivers innovative digital solutions to businesses, service lines, and internal operations with proven bottom-line results and outcomes. It helps power Deloitte’s success. It is the engine that drives Deloitte, serving many of the world’s largest, most respected companies. We develop and deploy cutting-edge internal and go-to-market solutions that help Deloitte operate effectively and lead in the market. Our reputation is built on a tradition of delivering with excellence. Key Qualifications : § A bachelor’s degree in computer science, software engineering, or a related discipline. An advanced degree (e.g., MS) is preferred but not required. Experience is the most relevant factor. § Strong data engineering foundation with deep understanding of data-structure, algorithms, code instrumentations, etc. § 3+ years proven experience with data ETL and ELT tools (such as ADF, Alteryx, cloud-native tools), data warehousing tools (such as SAP HANA, Snowflake, ADLS, Amazon Redshift, Google Cloud BigQuery). § 3+ years of experience with cloud-native engineering, using FaaS/PaaS/micro-services on cloud hyper-scalers like Azure, AWS, and GCP. § Strong understanding of methodologies & tools like XP, Lean, SAFe, DevSecOps, SRE, ADO, GitHub, SonarQube, etc. § Strong preference will be given to candidates with experience in AI/ML and GenAI. 
§ Excellent interpersonal and organizational skills, with the ability to handle diverse situations, complex projects, and changing priorities, behaving with passion, empathy, and care. How You will Grow: At Deloitte, our professional development plans focus on helping people at every level of their career to identify and use their strengths to do their best work every day and excel in everything they do. Our purpose Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities. Our people and culture Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work. Professional development At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India. Benefits to help you thrive At Deloitte, we know that great people make a great organization. 
Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially—and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment and/or other criteria. Learn more about what working at Deloitte can mean for you. Recruiting tips From developing a stand-out resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters. Requisition code: 306373

Posted 1 week ago

Apply

5.0 years

6 - 9 Lacs

Hyderābād

On-site

Data Analytics Engineer – CL4 Role Overview: As a Data Analytics Engineer, you will actively engage in your engineering craft, taking a hands-on approach to multiple high-visibility projects. Your expertise will be pivotal in delivering solutions that delight customers and users, while also driving tangible value for Deloitte's business investments. You will leverage your extensive engineering craftsmanship across multiple programming languages and modern frameworks, consistently demonstrating your strong track record in delivering high-quality, outcome-focused solutions. The ideal candidate will be a dependable team player, collaborating with cross-functional teams to design, develop, and deploy advanced software solutions. Key Responsibilities: Outcome-Driven Accountability: Embrace and drive a culture of accountability for customer and business outcomes. Develop engineering solutions that solve complex problems with valuable outcomes, ensuring high-quality, lean designs and implementations. Technical Leadership and Advocacy: Serve as the technical advocate for products, ensuring code integrity, feasibility, and alignment with business and customer goals. Lead requirement analysis, component design, development, unit testing, integrations, and support. Engineering Craftsmanship: Maintain accountability for the integrity of code design, implementation, quality, data, and ongoing maintenance and operations. Stay hands-on, self-driven, and continuously learn new approaches, languages, and frameworks. Create technical specifications and write high-quality, supportable, scalable code, ensuring all quality KPIs are met or exceeded. Demonstrate collaborative skills to work effectively with diverse teams. Customer-Centric Engineering: Develop lean engineering solutions through rapid, inexpensive experimentation to solve customer needs. Engage with customers and product teams before, during, and after delivery to ensure the right solution is delivered at the right time.
Incremental and Iterative Delivery: Adopt a mindset that favors action and evidence over extensive planning. Utilize a learning-forward approach to navigate complexity and uncertainty, delivering lean, supportable, and maintainable solutions. Cross-Functional Collaboration and Integration: Work collaboratively with empowered, cross-functional teams including product management, experience, and delivery. Integrate diverse perspectives to make well-informed decisions that balance feasibility, viability, usability, and value. Foster a collaborative environment that enhances team synergy and innovation. Advanced Technical Proficiency: Possess deep expertise in modern software engineering practices and principles, including Agile methodologies and DevSecOps, to deliver daily product deployments using full automation from code check-in to production, with all quality checks applied throughout the SDLC. Strive to be a role model, leveraging these techniques to optimize solutioning and product delivery. Demonstrate understanding of the full product development lifecycle, focusing on continuous improvement and learning. Domain Expertise: Quickly acquire domain-specific knowledge relevant to the business or product. Translate business/user needs, architectures, and data designs into technical specifications and code. Be a valuable, flexible, and dedicated team member, supportive of teammates, and focused on quality and tech debt payoff. Effective Communication and Influence: Exhibit exceptional communication skills, capable of articulating complex technical concepts clearly and compellingly. Inspire and influence teammates and product teams through well-structured arguments and trade-offs supported by evidence. Create coherent narratives that align technical solutions with business objectives. Engagement and Collaborative Co-Creation: Engage and collaborate with product engineering teams at all organizational levels, including customers as needed.
Build and maintain constructive relationships, fostering a culture of co-creation and shared momentum towards achieving product goals. Align diverse perspectives and drive consensus to create feasible solutions. The team: US Deloitte Technology Product Engineering has modernized software and product delivery, creating a scalable, cost-effective model that focuses on value and outcomes and leverages a progressive and responsive talent structure. As Deloitte’s primary internal development team, Product Engineering delivers innovative digital solutions to businesses, service lines, and internal operations with proven bottom-line results and outcomes. It helps power Deloitte’s success. It is the engine that drives Deloitte, serving many of the world’s largest, most respected companies. We develop and deploy cutting-edge internal and go-to-market solutions that help Deloitte operate effectively and lead in the market. Our reputation is built on a tradition of delivering with excellence. Key Qualifications: § A bachelor’s degree in computer science, software engineering, or a related discipline. An advanced degree (e.g., MS) is preferred but not required. Experience is the most relevant factor. § Strong data engineering foundation with a deep understanding of data structures, algorithms, code instrumentation, etc. § 5+ years of proven experience with data ETL and ELT tools (such as ADF, Alteryx, cloud-native tools) and data warehousing tools (such as SAP HANA, Snowflake, ADLS, Amazon Redshift, Google Cloud BigQuery). § 5+ years of experience with cloud-native engineering, using FaaS/PaaS/microservices on cloud hyper-scalers like Azure, AWS, and GCP. § Strong understanding of methodologies & tools like XP, Lean, SAFe, DevSecOps, SRE, ADO, GitHub, SonarQube, etc. § Strong preference will be given to candidates with experience in AI/ML and GenAI.
§ Excellent interpersonal and organizational skills, with the ability to handle diverse situations, complex projects, and changing priorities, behaving with passion, empathy, and care. How You will Grow: At Deloitte, our professional development plans focus on helping people at every level of their career to identify and use their strengths to do their best work every day and excel in everything they do. Our purpose Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities. Our people and culture Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work. Professional development At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India. Benefits to help you thrive At Deloitte, we know that great people make a great organization. 
Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially—and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment and/or other criteria. Learn more about what working at Deloitte can mean for you. Recruiting tips From developing a stand-out resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters. Requisition code: 306372

Posted 1 week ago

Apply

0 years

6 - 10 Lacs

Hyderābād

On-site

Join Amgen’s Mission of Serving Patients At Amgen, if you feel like you’re part of something bigger, it’s because you are. Our shared mission—to serve patients living with serious illnesses—drives all that we do. Since 1980, we’ve helped pioneer the world of biotech in our fight against the world’s toughest diseases. With our focus on four therapeutic areas – Oncology, Inflammation, General Medicine, and Rare Disease – we reach millions of patients each year. As a member of the Amgen team, you’ll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science-based. If you have a passion for challenges and the opportunities that lie within them, you’ll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career. What you will do Let’s do this. Let’s change the world. In this vital role you will play a key role in successfully leading the engagement model between Amgen's Technology organization and Global Commercial Operations. Collaborate with G&A (Finance, HR, Legal, IT etc.) Business SMEs, Data Engineers, Data Scientists and Product Managers to lead business analysis activities, ensuring alignment with engineering and product goals on the Data & AI Product Teams Become a G&A (Finance, HR, Legal, IT etc.)
domain authority in Data & AI technology capabilities by researching, deploying, and sustaining features built according to Amgen’s Quality System Lead the voice-of-the-customer assessment to define business processes and product needs Work with Product Managers and customers to define scope and value for new developments Collaborate with Engineering and Product Management to prioritize release scopes and refine the Product backlog Ensure non-functional requirements are included and prioritized in the Product and Release Backlogs Facilitate the breakdown of Epics into Features and Sprint-Sized User Stories and participate in backlog reviews with the development team Clearly express features in User Stories/requirements so all team members and partners understand how they fit into the product backlog Ensure Acceptance Criteria and Definition of Done are well-defined Work closely with Business SMEs, Data Scientists, and ML Engineers to understand data product requirements, KPIs, etc. Analyze the source systems and create the STTM (source-to-target mapping) documents. Develop and implement effective product demonstrations for internal and external partners Maintain accurate documentation of configurations, processes, and changes Understand end-to-end data pipeline design and dataflow Apply knowledge of data structures to diagnose data issues for resolution by the data engineering team What we expect of you We are all different, yet we all use our unique contributions to serve patients. We are seeking a highly skilled and experienced Principal IS Business Analyst with a passion for innovation and a collaborative working style that partners effectively with business and technology leaders with these qualifications. Basic Qualifications: 12 to 17 years of experience in G&A (Finance, HR, Legal, IT etc.) Information Systems Mandatory work experience as a business analyst in DWH, data product building, and BI & Analytics applications.
Experience analyzing the requirements of BI, AI & Analytics applications and working with Data Source SMEs and Data Owners to identify the data sources and data flows Experience with writing user requirements and acceptance criteria Affinity to work in a DevOps environment with an Agile mindset Ability to work in a team environment, effectively interacting with others Ability to meet deadlines and schedules and be accountable Preferred Qualifications: Must-Have Skills Excellent problem-solving skills and a passion for solving complex challenges in AI-driven technologies Experience with Agile software development methodologies (Scrum) Superb communication skills and the ability to work with senior leadership with confidence and clarity Experience with writing user requirements and acceptance criteria in agile project management systems such as JIRA Experience in managing product features for PI planning and developing product roadmaps and user journeys Good-to-Have Skills: Demonstrated expertise in data and analytics and related technology concepts Understanding of data and analytics software systems strategy, governance, and infrastructure Familiarity with low-code/no-code test automation software Technical thought leadership Able to communicate technical or complex subject matter in business terms Jira Align experience Experience with DevOps, Continuous Integration, and Continuous Delivery methodology Soft Skills: Able to work under minimal supervision Excellent analytical and gap/fit assessment skills Strong verbal and written communication skills Ability to work effectively with global, virtual teams High degree of initiative and self-motivation Ability to manage multiple priorities successfully Team-oriented, with a focus on achieving team goals Strong presentation and public speaking skills Technical Skills: Experience with cloud-based data technologies (e.g., Databricks, Redshift, S3) on AWS or similar cloud platforms Experience with design
patterns, data structures, test-driven development Knowledge of NLP techniques for text analysis and sentiment analysis What you can expect of us As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we’ll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now and make a lasting impact with the Amgen team. careers.amgen.com As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.

Posted 1 week ago

Apply

0 years

4 - 10 Lacs

Hyderābād

On-site

DESCRIPTION At Amazon, we strive to be the most innovative and customer-centric company on the planet. Come work with us to develop innovative products, tools and research-driven solutions in a fast-paced environment by collaborating with smart and passionate leaders, program managers and software developers. This role is based out of our Hyderabad corporate office and is for a passionate, dynamic, analytical, innovative, hands-on, and customer-obsessed Business Analyst. Your team will be composed of Business Analysts, Data Engineers, and Business Intelligence Engineers based in Hyderabad, Europe and the US. Key job responsibilities The ideal candidate will have experience working with large datasets and distributed computing technologies. The candidate relishes working with large volumes of data, enjoys the challenge of highly complex technical contexts, and is passionate about data and analytics. He/she should be an expert in data modeling, ETL design and business intelligence tools, with hands-on knowledge of columnar databases such as Redshift and other related AWS technologies. He/she passionately partners with the customers to identify strategic opportunities in the field of data analysis & engineering. He/she should be a self-starter, comfortable with ambiguity, able to think big (while paying careful attention to detail) and enjoys working in a fast-paced team that continuously learns and evolves on a day-to-day basis. A day in the life Key job responsibilities: This role primarily focuses on deep-dives, creating dashboards for the business, and working with different teams to develop and track metrics and bridges. Design, develop and maintain scalable, automated, user-friendly systems, reports, dashboards, etc.
that will support our analytical and business needs In-depth research of drivers of the Localization business Analyze key metrics to uncover trends and root causes of issues Suggest and build new metrics and analysis that enable better perspective on the business Capture the right metrics to influence stakeholders and measure success Develop domain expertise and apply it to operational problems to find solutions Work across teams with different stakeholders to prioritize and deliver data and reporting Recognize and adopt best practices in reporting and analysis: data integrity, test design, analysis, validation, and documentation Maintain BI architecture including our AWS account, database and various analytics tools. BASIC QUALIFICATIONS SQL mastery is a must Some scripting knowledge (Python, R, Scala) Stakeholder management Dashboarding (Excel, QuickSight, Power BI) Data analysis and statistics KPI design PREFERRED QUALIFICATIONS Power BI and Power Pivot in Excel AWS fundamentals (IAM, S3, ..) Python Apache Spark / Scala Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner. Job details IND, TS, Hyderabad International Technology Business Intelligence
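The "metrics and bridges" work this Business Analyst role describes is essentially period-over-period KPI decomposition. As a purely illustrative sketch (the metric, figures, and function name below are hypothetical, not taken from the posting), the core computation might look like this in Python:

```python
def metric_bridge(weekly_totals):
    """Compute week-over-week deltas ("bridges") for a KPI series.

    weekly_totals: mapping of week label -> metric value, in week order.
    Returns a list of (week, value, delta_vs_prior_week) tuples.
    """
    bridge = []
    prior = None
    for week, value in weekly_totals.items():
        delta = None if prior is None else value - prior
        bridge.append((week, value, delta))
        prior = value
    return bridge

# Hypothetical weekly order totals for a Localization dashboard.
rows = {"2024-W01": 1200, "2024-W02": 1350, "2024-W03": 1280}
print(metric_bridge(rows))
# [('2024-W01', 1200, None), ('2024-W02', 1350, 150), ('2024-W03', 1280, -70)]
```

In practice the weekly totals would come from a Redshift query rather than a literal dict, with the result feeding an Excel or QuickSight dashboard.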

Posted 1 week ago

Apply

2.0 years

0 Lacs

Hyderābād

On-site

DESCRIPTION “When you attract people who have the DNA of pioneers and the DNA of explorers, you build a company of like-minded people who want to invent. And that’s what they think about when they get up in the morning: how are we going to work backwards from customers and build a great service or a great product” – Jeff Bezos Amazon.com’s success is built on a foundation of customer obsession. Have you ever thought about what it takes to successfully deliver millions of packages to Amazon customers seamlessly every day like clockwork? In order to make that happen, behind those millions of packages, billions of decisions get made by machines and humans. What is the accuracy of the customer-provided address? Do we know the exact location of the address on the map? Is there a safe place? Can we make an unattended delivery? Would a signature be required? Is the address a commercial property? Do we know the business hours of the address? What if the customer is not home? Is there an alternate delivery address? Does the customer have any special preference? What are other addresses that also have packages to be delivered on the same day? Are we optimizing the delivery associate’s route? Does the delivery associate know the locality well enough? Is there an access code to get inside the building? And the list simply goes on. At the core of all of it lies the quality of the underlying data that can help make those decisions in time. The person in this role will be a strong influencer who will ensure goal alignment with Technology, Operations, and Finance teams. This role will serve as the face of the organization to global stakeholders. This position requires a results-oriented, high-energy, dynamic individual with both stamina and mental quickness to be able to work and thrive in a fast-paced, high-growth global organization. Excellent communication skills and executive presence to get in front of VPs and SVPs across Amazon will be imperative.
Key Strategic Objectives: Amazon is seeking an experienced leader to own the vision for quality improvement through global address management programs. As a Business Intelligence Engineer on Amazon's last mile quality team, you will be responsible for shaping the strategy and direction of customer-facing products that are core to the customer experience. As a key member of the last mile leadership team, you will continually raise the bar on both quality and performance. You will bring innovation, a strategic perspective, a passionate voice, and an ability to prioritize and execute on a fast-moving set of priorities, competitive pressures, and operational initiatives. You will partner closely with product and technology teams to define and build innovative and delightful experiences for customers. You must be highly analytical, able to work extremely effectively in a matrix organization, and have the ability to break complex problems down into steps that drive product development at Amazon speed. You will set the tempo for defect reduction through continuous improvement and drive accountability across multiple business units in order to deliver large-scale, high-visibility, high-impact projects. You will lead by example to be just as passionate about operational performance and predictability as you will be about all other aspects of customer experience. The successful candidate will be able to: Effectively manage customer expectations and resolve conflicts that balance client and company needs. Develop processes to effectively maintain and disseminate project information to stakeholders. Be successful in a delivery-focused environment, determining the right processes to make the team successful. This opportunity requires excellent technical, problem-solving, and communication skills. The candidate is not just a policy maker/spokesperson but drives to get things done. Possess superior analytical abilities and judgment.
Use quantitative and qualitative data to prioritize and influence, show creativity, experimentation and innovation, and drive projects with urgency in this fast-paced environment. Partner with key stakeholders to develop the vision and strategy for customer experience on our platforms. Influence product roadmaps based on this strategy along with your teams. Support the scalable growth of the company by developing and enabling the success of the Operations leadership team. Serve as a role model for Amazon Leadership Principles inside and outside the organization Actively seek to implement and distribute best practices across the operation BASIC QUALIFICATIONS 2+ years of experience analyzing and interpreting data with Redshift, Oracle, NoSQL, etc. Experience with data visualization using Tableau, QuickSight, or similar tools Experience with a scripting language (e.g., Python, Java, or R) Experience building and maintaining basic data artifacts (e.g., ETL, data models, queries) Experience applying basic statistical methods (e.g., regression) to difficult business problems Experience gathering business requirements, using industry standard business intelligence tool(s) to extract data, formulate metrics and build reports Track record of generating key business insights and collaborating with stakeholders PREFERRED QUALIFICATIONS Knowledge of how to improve code quality and optimize BI processes (e.g. speed, cost, reliability) Knowledge of data modeling and data pipeline design Experience in designing and implementing custom reporting systems using automation tools Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information.
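The basic qualifications above mention applying basic statistical methods such as regression to business problems. As a minimal, self-contained illustration (the data points are invented), ordinary least squares for a single predictor reduces to two closed-form expressions:

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x.

    A bare-bones illustration of simple linear regression; real analyses
    would typically use a library such as statsmodels or scikit-learn.
    """
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Slope: covariance of x and y divided by variance of x.
    b = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
    # Intercept follows from the line passing through the means.
    a = mean_y - b * mean_x
    return a, b

# Perfectly linear toy data: y = 2x, so intercept 0 and slope 2.
a, b = fit_line([1, 2, 3, 4], [2, 4, 6, 8])
print(a, b)  # 0.0 2.0
```

A BIE might fit a trend like this to a defect-rate series before deciding whether a week-over-week change is signal or noise.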
If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner. Job details IND, TS, Hyderabad Business Intelligence

Posted 1 week ago

Apply

3.0 years

0 Lacs

Hyderābād

On-site

DESCRIPTION The Amazon Web Services Professional Services (ProServe) team is seeking a skilled Delivery Consultant to join our team at Amazon Web Services (AWS). In this role, you'll work closely with customers to design, implement, and manage AWS solutions that meet their technical requirements and business objectives. You'll be a key player in driving customer success through their cloud journey, providing technical expertise and best practices throughout the project lifecycle. AWS Global Services includes experts from across AWS who help our customers design, build, operate, and secure their cloud environments. Customers innovate with AWS Professional Services, upskill with AWS Training and Certification, optimize with AWS Support and Managed Services, and meet objectives with AWS Security Assurance Services. Our expertise and emerging technologies include AWS Partners, AWS Sovereign Cloud, AWS International Product, and the Generative AI Innovation Center. You’ll join a diverse team of technical experts in dozens of countries who help customers achieve more with the AWS cloud. Possessing a deep understanding of AWS products and services, as a Delivery Consultant you will be proficient in architecting complex, scalable, and secure solutions tailored to meet the specific needs of each customer. You’ll work closely with stakeholders to gather requirements, assess current infrastructure, and propose effective migration strategies to AWS. As trusted advisors to our customers, providing guidance on industry trends, emerging technologies, and innovative solutions, you will be responsible for leading the implementation process, ensuring adherence to best practices, optimizing performance, and managing risks throughout the project. The AWS Professional Services organization is a global team of experts that help customers realize their desired business outcomes when using the AWS Cloud. 
We work together with customer teams and the AWS Partner Network (APN) to execute enterprise cloud computing initiatives. Our team provides assistance through a collection of offerings which help customers achieve specific outcomes related to enterprise cloud adoption. We also deliver focused guidance through our global specialty practices, which cover a variety of solutions, technologies, and industries. Key job responsibilities As an experienced technology professional, you will be responsible for: Designing and implementing complex, scalable, and secure AWS solutions tailored to customer needs Providing technical guidance and troubleshooting support throughout project delivery Collaborating with stakeholders to gather requirements and propose effective migration strategies Acting as a trusted advisor to customers on industry trends and emerging technologies Sharing knowledge within the organization through mentoring, training, and creating reusable artifacts About the team Diverse Experiences AWS values diverse experiences. Even if you do not meet all of the preferred qualifications and skills listed in the job description, we encourage candidates to apply. If your career is just starting, hasn’t followed a traditional path, or includes alternative experiences, don’t let it stop you from applying. Why AWS? Amazon Web Services (AWS) is the world’s most comprehensive and broadly adopted cloud platform. We pioneered cloud computing and never stopped innovating — that’s why customers from the most successful startups to Global 500 companies trust our robust suite of products and services to power their businesses. Inclusive Team Culture AWS values curiosity and connection. Our employee-led and company-sponsored affinity groups promote inclusion and empower our people to take pride in what makes us unique. Our inclusion events foster stronger, more collaborative teams.
Our continual innovation is fueled by the bold ideas, fresh perspectives, and passionate voices our teams bring to everything we do. Mentorship & Career Growth We’re continuously raising our performance bar as we strive to become Earth’s Best Employer. That’s why you’ll find endless knowledge-sharing, mentorship and other career-advancing resources here to help you develop into a better-rounded professional. Work/Life Balance We value work-life harmony. Achieving success at work should never come at the expense of sacrifices at home, which is why we strive for flexibility as part of our working culture. When we feel supported in the workplace and at home, there’s nothing we can’t achieve in the cloud. BASIC QUALIFICATIONS 3+ years of experience in cloud architecture and implementation, Bachelor's degree in Computer Science, Engineering, related field, or equivalent experience Knowledge of the primary AWS services (EC2, ELB, RDS, Route 53 & S3) Experience in database development/migration using PL/SQL or PL/pgSQL Hands-on experience with Oracle and PostgreSQL PREFERRED QUALIFICATIONS AWS experience preferred, with proficiency in a wide range of AWS services (e.g., EC2, S3, RDS, Lambda, IAM, VPC, CloudFormation) AWS Professional-level certifications (e.g., Solutions Architect Professional, DevOps Engineer Professional) preferred Experience with automation and scripting (e.g., Terraform, Python) Knowledge of security and compliance standards (e.g., HIPAA, GDPR) Strong communication skills with the ability to explain technical concepts to both technical and non-technical audiences Experience in non-relational databases – DynamoDB, MongoDB, etc. Experience in MPP data warehouse solutions such as Redshift, Greenplum, Snowflake, etc. Our inclusive culture empowers Amazonians to deliver the best results for our customers.
If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner. Job details IND, TS, Hyderabad IND, KA, Bangalore IND, MH, Maharashtra IND, HR, Gurugram Solutions Architect
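The database migration work this consultant role touches (e.g., Oracle PL/SQL to PostgreSQL PL/pgSQL) typically ends with a reconciliation step. A minimal sketch with hypothetical table names and counts (none of this comes from the posting; real checks would query both databases rather than take dicts):

```python
def validate_migration(source_counts, target_counts):
    """Compare per-table row counts between a source database (e.g. Oracle)
    and a migration target (e.g. PostgreSQL); return mismatched tables.

    Each argument maps table name -> row count. The result maps each
    mismatched table to its (source, target) pair; a missing target table
    shows up as (source_count, None).
    """
    mismatches = {}
    for table, src in source_counts.items():
        tgt = target_counts.get(table)
        if tgt != src:
            mismatches[table] = (src, tgt)
    return mismatches

# Hypothetical post-migration counts.
src = {"orders": 10000, "customers": 2500}
tgt = {"orders": 10000, "customers": 2499}
print(validate_migration(src, tgt))  # {'customers': (2500, 2499)}
```

Row counts are only a first-pass check; production reconciliation usually adds checksums or sampled row comparisons on top.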

Posted 1 week ago

Apply

1.0 years

4 - 10 Lacs

Hyderābād

Remote

DESCRIPTION Want to join the Earth’s most customer-centric company? Do you like to dive deep to understand problems? Are you someone who likes to challenge the status quo? Do you strive to excel at goals assigned to you? If yes, we have opportunities for you. Global Operations – Artificial Intelligence (GO-AI) at Amazon is looking to hire candidates who can excel in a fast-paced dynamic environment. Are you somebody that likes to use and analyze big data to drive business decisions? Do you enjoy converting data into insights that will be used to enhance customer decisions worldwide for business leaders? Do you want to be part of the data team which measures the pulse of innovative machine-vision-based projects? If your answer is yes, join our team. GO-AI is looking for a motivated individual with strong skills and experience in resource utilization planning, process optimization and execution of scalable and robust operational mechanisms, to join the GO-AI Ops DnA team. In this position you will be responsible for supporting our sites to build solutions for the rapidly expanding GO-AI team. The role requires the ability to work with a variety of key stakeholders across job functions with multiple sites. We are looking for an entrepreneurial and analytical program manager, who is passionate about their work, understands how to manage service levels across multiple skills/programs, and who is willing to move fast and experiment often. Key job responsibilities Design and develop highly available dashboards and metrics using SQL and Excel/Tableau Execute high-priority (i.e.
cross functional, high impact) projects to create robust, scalable analytics solutions and frameworks with the help of Analytics/BIE managers Work closely with internal stakeholders such as business teams, engineering teams, and partner teams and align them with respect to your focus area Creates and maintains comprehensive business documentation including user stories, acceptance criteria, and process flows that help the BIE understand the context for developing ETL processes and visualization solutions. Performs user acceptance testing and business validation of delivered dashboards and reports, ensuring that BIE-created solutions meet actual operational needs and can be effectively utilized by site managers and operations teams. Monitors business performance metrics and operational KPIs to proactively identify emerging analytical requirements, working with BIEs to rapidly develop solutions that address real-time operational challenges in the dynamic AI-enhanced fulfillment environment. About the team The Global Operations – Artificial Intelligence (GO-AI) team remotely handles exceptions in the Amazon Robotic Fulfillment Centers Globally. GO-AI seeks to complement automated vision based decision-making technologies by providing remote human support for the subset of tasks which require higher cognitive ability and cannot be processed through automated decision making with high confidence. This team provides end-to-end solutions through inbuilt competencies of Operations and strong central specialized teams to deliver programs at Amazon scale. It is operating multiple programs including Nike IDS, Proteus, Sparrow and other new initiatives in partnership with global technology and operations teams. 
BASIC QUALIFICATIONS Experience defining requirements and using data and metrics to draw business insights. Knowledge of SQL. Knowledge of data visualization tools such as Amazon QuickSight, Tableau, Power BI, or other BI packages. Knowledge of Python, VBA, macros, and Selenium scripts. 1+ years of experience working in an Analytics / Business Intelligence environment, with prior experience designing and executing analytical projects. PREFERRED QUALIFICATIONS Experience using AI tools. Experience with Amazon Redshift and other AWS technologies for large datasets. An analytical mindset, with the ability to see the big picture and influence others. Detail-oriented, with an aptitude for solving unstructured problems; the role requires the ability to extract data from various sources and to design, construct, and execute complex analyses that produce the data and reports needed to solve the business problem. Good oral, written, and presentation skills, combined with the ability to take part in group discussions and explain complex solutions. Ability to apply analytical, computer, statistical, and quantitative problem-solving skills. Ability to work effectively in a multi-task, high-volume environment. Adaptable and flexible in responding to deadlines and workflow fluctuations. Our inclusive culture empowers Amazonians to deliver the best results for our customers. Job details IND, TS, Hyderabad IND, TS, Virtual Corporate Operations
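Dashboards of the kind described above are usually fed by aggregate SQL. As a minimal, hedged sketch (sqlite3 stands in for Redshift, and the `exceptions` table and its columns are illustrative, not from the posting), the per-site KPI query behind a dashboard tile might look like:

```python
import sqlite3

# Hypothetical exceptions table standing in for a Redshift source;
# table and column names are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE exceptions (site TEXT, resolved INTEGER)")
conn.executemany("INSERT INTO exceptions VALUES (?, ?)",
                 [("HYD1", 1), ("HYD1", 0), ("BLR1", 1), ("BLR1", 1)])

# Resolution-rate KPI per site: the kind of metric a dashboard tile would show.
rows = conn.execute("""
    SELECT site, ROUND(AVG(resolved) * 100, 1) AS resolution_pct
    FROM exceptions
    GROUP BY site
    ORDER BY site
""").fetchall()
print(rows)  # [('BLR1', 100.0), ('HYD1', 50.0)]
```

The same `GROUP BY` shape transfers directly to Redshift or QuickSight datasets; only the connection layer changes.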

Posted 1 week ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Work Mode: Hybrid Work Location: Chennai / Hyderabad / Bangalore / Pune / Mumbai / Gurgaon Work Timing: 2 PM to 11 PM Primary: Data Engineer AWS Data Engineer - AWS Glue, Amazon Redshift, S3, ETL processes, SQL, Databricks JD Examine the business needs to determine the automation testing technique. Maintain the present regression suites and test scripts. Attend agile meetings for backlog refinement, sprint planning, and daily scrum. Execute regression suites and provide results to developers, project managers, stakeholders, and manual testers. Responsibilities (AWS Data Engineer): Design and build scalable data pipelines using AWS services such as AWS Glue, Amazon Redshift, and S3. Develop efficient ETL processes for data extraction, transformation, and loading into data warehouses and data lakes. Create and manage applications using Python, SQL, Databricks, and various AWS technologies. Automate repetitive tasks and build reusable frameworks to improve efficiency.
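In production, the extract-transform-load work described here would run as a PySpark job on AWS Glue; as a dependency-free local sketch of the same three-step shape (the file contents, column names, and the sqlite3 "warehouse" are all illustrative stand-ins), it reduces to:

```python
import csv
import io
import sqlite3

# Local stand-in for an ETL job: in production this shape maps to AWS Glue
# (extract from S3, transform with PySpark, load into Redshift).
raw = "order_id,amount\n1,100\n2,250\n3,75\n"

# Extract: parse raw records from the source.
records = list(csv.DictReader(io.StringIO(raw)))

# Transform: cast types and filter out small orders.
rows = [(int(r["order_id"]), float(r["amount"]))
        for r in records if float(r["amount"]) >= 100]

# Load: write into a warehouse table (sqlite standing in for Redshift).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)", rows)
total = conn.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
print(total)  # 350.0
```

The value of keeping the three stages separate is that each can be swapped independently: the extract step for an S3 reader, the load step for a Redshift `COPY`, without touching the transform logic.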

Posted 1 week ago

Apply

6.0 years

3 - 8 Lacs

Cochin

On-site

Responsibilities: Identify automation opportunities within the system and lead discussions with stakeholders and clients. Conduct multiple proofs of concept (PoCs) on Tosca's capabilities and present demos to stakeholders and clients. Review software requirements and prepare test scenarios. Collaborate with QA Engineers to develop effective strategies, test plans, and test cases. Develop and execute both automated and manual test cases and scripts. Analyze test results for database impacts, errors, bugs, and usability issues. Report bugs and errors to development teams. Assist in troubleshooting issues. Work with cross-functional teams to ensure quality throughout the software development lifecycle. Regularly update and maintain test documentation, ensuring accuracy and completeness for all testing phases. Monitor, track, and report on testing activities, progress, and outcomes to stakeholders, providing insights and recommendations for improvements. Provide expertise and guidance in quality assurance best practices, continually improving testing methodologies and processes. Qualifications: Bachelor's degree or equivalent experience. 6-8 years of professional experience in software quality assurance with a focus on Tosca automation. 2-3+ years of experience as an Automation Lead. Proficiency in the Tosca tool suite, including Tosca Commander and DEX. Strong leadership and communication skills to effectively lead a team and collaborate with stakeholders. Experience in creating and maintaining test automation frameworks and scripts using Tosca. Solid understanding of agile methodologies and the software development lifecycle (SDLC). Ability to analyze and interpret complex technical information and communicate it effectively to non-technical stakeholders. Experience in mobile, mainframe, API, and desktop-based automation. Test plan and test strategy creation. Mandatory certification: Automation Specialist (AS1, AS2, AE1, TDS1).
Experience integrating Tosca with CI/CD pipelines and DEX Server for continuous testing. Familiarity with other automation testing tools and frameworks. Up-to-date knowledge of software test design and testing methodologies. Familiarity with agile frameworks. Ability to develop test strategies and write test cases. Ability to document and troubleshoot errors. Working knowledge of test management software. Industry experience with a healthcare insurance company. Experience testing both web and mobile applications. Excellent communication and critical-thinking skills. Good organizational skills and a detail-oriented mindset. An analytical mind and problem-solving aptitude. Technical Skills: ETL testing tools: Tableau, Cognos, Informatica, DataStage, MuleSoft, Power BI, Databricks. Test management tools: Spira, qTest, TestRail, HP ALM, and JIRA. Automation testing tools: Tosca and Selenium. Programming languages: Java, Python. Databases: Oracle, Amazon Redshift. IQVIA is a leading global provider of clinical research services, commercial insights, and healthcare intelligence to the life sciences and healthcare industries. We create intelligent connections to accelerate the development and commercialization of innovative medical treatments to help improve patient outcomes and population health worldwide. Learn more at https://jobs.iqvia.com

Posted 1 week ago

Apply

10.0 years

0 Lacs

Gurgaon

On-site

Acuity Knowledge Partners (Acuity) is a leading provider of bespoke research, analytics and technology solutions to the financial services sector, including asset managers, corporate and investment banks, private equity and venture capital firms, hedge funds and consulting firms. Its global network of over 6,000 analysts and industry experts, combined with proprietary technology, supports more than 600 financial institutions and consulting companies to operate more efficiently and unlock their human capital, driving revenue higher and transforming operations. Acuity is headquartered in London and operates from 10 locations worldwide. The company fosters a diverse, equitable and inclusive work environment, nurturing talent, regardless of race, gender, ethnicity or sexual orientation. Acuity was established as a separate business from Moody's Corporation in 2019, following its acquisition by Equistone Partners Europe (Equistone). In January 2023, funds advised by global private equity firm Permira acquired a majority stake in the business from Equistone, which remains invested as a minority shareholder. For more information, visit www.acuitykp.com Position Title: Associate Director (Senior Architect – Data) Department: IT Location: Gurgaon / Bangalore Job Summary The Enterprise Data Architect will enhance the company's strategic use of data by designing, developing, and implementing data models for enterprise applications and systems at the conceptual, logical, business-area, and application layers. This role advocates data modeling methodologies and best practices. We seek a skilled Data Architect with deep knowledge of data architecture principles, extensive data modeling experience, and the ability to create scalable data solutions. Responsibilities include developing and maintaining the enterprise data architecture and ensuring data integrity, interoperability, security, and availability, with a focus on ongoing digital transformation projects. Key Responsibilities 1.
Strategy & Planning
- Develop and deliver long-term strategic goals for data architecture vision and standards in conjunction with data users, department managers, clients, and other key stakeholders.
- Create short-term tactical solutions to achieve long-term objectives and an overall data management roadmap.
- Establish processes for governing the identification, collection, and use of corporate metadata; take steps to assure metadata accuracy and validity.
- Establish methods and procedures for tracking data quality, completeness, redundancy, and improvement.
- Conduct data capacity planning, life cycle, duration, and usage-requirement analyses, feasibility studies, and other tasks.
- Create strategies and plans for data security, backup, disaster recovery, business continuity, and archiving.
- Ensure that data strategies and architectures are aligned with regulatory compliance.
- Develop a comprehensive data strategy in collaboration with different stakeholders that aligns with the transformational projects' goals.
- Ensure effective data management throughout the project lifecycle.
2. Acquisition & Deployment
- Ensure the success of enterprise-level application rollouts (e.g. ERP, CRM, HCM, FP&A, etc.).
- Liaise with vendors and service providers to select the products or services that best meet company goals.
3. Operational Management
- Assess and determine governance, stewardship, and frameworks for managing data across the organization.
- Develop and promote data management methodologies and standards.
- Document information products from business processes and create data entities.
- Create entity relationship diagrams to show the digital thread across the value streams and enterprise.
- Drive data normalization across all systems and databases to ensure a common definition of data entities across the enterprise.
- Document enterprise reporting needs and develop the data strategy to enable a single source of truth for all reporting data.
- Address the regulatory compliance requirements of each country and ensure our data is secure and compliant.
- Select and implement the appropriate tools, software, applications, and systems to support data technology goals.
- Oversee the mapping of data sources, data movement, interfaces, and analytics, with the goal of ensuring data quality.
- Collaborate with project managers and business unit leaders on all projects involving enterprise data.
- Address data-related problems regarding systems integration, compatibility, and multiple-platform integration.
- Act as a leader and advocate of data management, including coaching, training, and career development for staff.
- Develop and implement key components as needed to create testing criteria that guarantee the fidelity and performance of the data architecture.
- Document the data architecture and environment to maintain a current and accurate view of the larger data picture.
- Identify and develop opportunities for data reuse, migration, or retirement.
4. Data Architecture Design
- Develop and maintain the enterprise data architecture, including data models, databases, data warehouses, and data lakes.
- Design and implement scalable, high-performance data solutions that meet business requirements.
5. Data Governance
- Establish and enforce data governance policies and procedures as agreed with stakeholders.
- Maintain data integrity, quality, and security within Finance, HR, and other such enterprise systems.
6.
Data Migration
- Oversee the data migration process from legacy systems to the new systems being put in place.
- Define and manage data mappings, cleansing, transformation, and validation to ensure accuracy and completeness.
7. Master Data Management
- Devise processes to manage master data (e.g., customer, vendor, product information) to ensure consistency and accuracy across enterprise systems and business processes.
- Provide data management (create, update, and delimit) methods to ensure master data is governed.
8. Stakeholder Collaboration
- Collaborate with various stakeholders, including business users, other system vendors, and other interested parties, to understand data requirements.
- Ensure the enterprise system meets the organization's data needs.
9. Training and Support
- Provide training and support to end users on data entry, retrieval, and reporting within the candidate enterprise systems.
- Promote user adoption and proper use of data.
10. Data Quality Assurance
- Implement data quality assurance measures to identify and correct data issues.
- Ensure Oracle Fusion and other enterprise systems contain reliable and up-to-date information.
11. Reporting and Analytics
- Facilitate the development of reporting and analytics capabilities within Oracle Fusion and other systems.
- Enable data-driven decision-making through robust data analysis.
12. Continuous Improvement
- Continuously monitor and improve data processes and the data capabilities of Oracle Fusion and other systems.
- Leverage new technologies for enhanced data management to support evolving business needs.
Technology and Tools: Oracle Fusion Cloud. Data modeling tools (e.g., ER/Studio, ERwin). ETL tools (e.g., Informatica, Talend, Azure Data Factory). Data pipelines: understanding of data pipeline tools like Apache Airflow and AWS Glue.
Database management systems: Oracle Database, MySQL, SQL Server, PostgreSQL, MongoDB, Cassandra, Couchbase, Redis, Hadoop, Apache Spark, Amazon RDS, Google BigQuery, Microsoft Azure SQL Database, Neo4j, OrientDB, Memcached. Data governance tools (e.g., Collibra, Informatica Axon, Oracle EDM, Oracle MDM). Reporting and analytics tools (e.g., Oracle Analytics Cloud, Power BI, Tableau, Oracle BIP). Hyperscalers / cloud platforms (e.g., AWS, Azure). Big data technologies such as Hadoop, HDFS, MapReduce, and Spark. Cloud platforms such as Amazon Web Services (including RDS, Redshift, and S3), Microsoft Azure services such as Azure SQL Database and Cosmos DB, and Google Cloud Platform services such as BigQuery and Cloud Storage. Programming languages (e.g., Java, J2EE, EJB, .NET, WebSphere):
- SQL: strong SQL skills for querying and managing databases.
- Python: proficiency in Python for data manipulation and analysis.
- Java: knowledge of Java for building data-driven applications.
Data security and protocols: understanding of data security protocols and compliance standards. Key Competencies / Qualifications: Education: Bachelor's degree in computer science, information technology, or a related field; master's degree preferred. Experience: 10+ years overall, with at least 7 years of experience in data architecture, data modeling, and database design. Proven experience with data warehousing, data lakes, and big data technologies. Expertise in SQL and experience with NoSQL databases. Experience with cloud platforms (e.g., AWS, Azure) and related data services. Experience with Oracle Fusion or similar ERP systems is highly desirable. Skills: Strong understanding of data governance and data security best practices. Excellent problem-solving and analytical skills. Strong communication and interpersonal skills. Ability to work effectively in a collaborative team environment. Leadership experience with a track record of mentoring and developing team members.
Excellent documentation and presentation skills. Good knowledge of applicable data privacy practices and laws. Certifications: relevant certifications (e.g., Certified Data Management Professional, AWS Certified Big Data – Specialty) are a plus. Behavioral: a self-starter, an excellent planner and executor, and above all a good team player. Excellent communication and interpersonal skills are a must. Must possess organizational skills, including multitasking, priority setting, and meeting deadlines. Ability to build collaborative relationships and effectively leverage networks to mobilize resources. Initiative to learn the business domain is highly desirable. Enjoys a dynamic, constantly evolving environment and requirements.
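The entity-relationship and normalization work this role describes ultimately lands in schema DDL. As a minimal sketch (sqlite3 stands in for the warehouse, and every table and column name here is hypothetical), a star-schema fragment with an enforced dimension-to-fact link looks like:

```python
import sqlite3

# Illustrative star-schema fragment: one fact table keyed to a dimension,
# with a foreign key enforcing the link. All names are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")
conn.execute("""
    CREATE TABLE dim_customer (
        customer_id INTEGER PRIMARY KEY,
        name        TEXT NOT NULL
    )
""")
conn.execute("""
    CREATE TABLE fact_orders (
        order_id    INTEGER PRIMARY KEY,
        customer_id INTEGER NOT NULL REFERENCES dim_customer(customer_id),
        amount      REAL NOT NULL
    )
""")
conn.execute("INSERT INTO dim_customer VALUES (1, 'Acme')")
conn.execute("INSERT INTO fact_orders VALUES (10, 1, 99.5)")

# A referentially invalid row is rejected, which is the point of the constraint.
try:
    conn.execute("INSERT INTO fact_orders VALUES (11, 999, 10.0)")
    rejected = False
except sqlite3.IntegrityError:
    rejected = True
print(rejected)  # True
```

The constraint is what makes the ERD's "digital thread" real: a fact row cannot exist without its dimension row, so downstream reports can join safely.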

Posted 1 week ago

Apply

8.0 years

30 - 38 Lacs

Gurgaon

Remote

Role: AWS Data Engineer Location: Gurugram Mode: Hybrid Type: Permanent Job Description: We are seeking a talented and motivated Data Engineer with requisite years of hands-on experience to join our growing data team. The ideal candidate will have experience working with large datasets, building data pipelines, and utilizing AWS public cloud services to support the design, development, and maintenance of scalable data architectures. This is an excellent opportunity for individuals who are passionate about data engineering and cloud technologies and want to make an impact in a dynamic and innovative environment. Key Responsibilities: Data Pipeline Development: Design, develop, and optimize end-to-end data pipelines for extracting, transforming, and loading (ETL) large volumes of data from diverse sources into data warehouses or lakes. Cloud Infrastructure Management: Implement and manage data processing and storage solutions in AWS (Amazon Web Services) using services like S3, Redshift, Lambda, Glue, Kinesis, and others. Data Modeling: Collaborate with data scientists, analysts, and business stakeholders to define data requirements and design optimal data models for reporting and analysis. Performance Tuning & Optimization: Identify bottlenecks and optimize query performance, pipeline processes, and cloud resources to ensure cost-effective and scalable data workflows. Automation & Scripting: Develop automated data workflows and scripts to improve operational efficiency using Python, SQL, or other scripting languages. Collaboration & Documentation: Work closely with data analysts, data scientists, and other engineering teams to ensure data availability, integrity, and quality. Document processes, architectures, and solutions clearly. Data Quality & Governance: Ensure the accuracy, consistency, and completeness of data. Implement and maintain data governance policies to ensure compliance and security standards are met. 
Troubleshooting & Support: Provide ongoing support for data pipelines and troubleshoot issues related to data integration, performance, and system reliability. Qualifications: Essential Skills: Experience: 8+ years of professional experience as a Data Engineer, with a strong background in building and optimizing data pipelines and working with large-scale datasets. AWS Experience: Hands-on experience with AWS cloud services, particularly S3, Lambda, Glue, Redshift, RDS, and EC2. ETL Processes: Strong understanding of ETL concepts, tools, and frameworks. Experience with data integration, cleansing, and transformation. Programming Languages: Proficiency in Python, SQL, and other scripting languages (e.g., Bash, Scala, Java). Data Warehousing: Experience with relational and non-relational databases, including data warehousing solutions like AWS Redshift, Snowflake, or similar platforms. Data Modeling: Experience in designing data models, schema design, and data architecture for analytical systems. Version Control & CI/CD: Familiarity with version control tools (e.g., Git) and CI/CD pipelines. Problem-Solving: Strong troubleshooting skills, with an ability to optimize performance and resolve technical issues across the data pipeline. Desirable Skills: Big Data Technologies: Experience with Hadoop, Spark, or other big data technologies. Containerization & Orchestration: Knowledge of Docker, Kubernetes, or similar containerization/orchestration technologies. Data Security: Experience implementing security best practices in the cloud and managing data privacy requirements. Data Streaming: Familiarity with data streaming technologies such as AWS Kinesis or Apache Kafka. Business Intelligence Tools: Experience with BI tools (Tableau, QuickSight) for visualization and reporting. Agile Methodology: Familiarity with Agile development practices and tools (Jira, Trello, etc.).
Job Type: Permanent Pay: ₹3,000,000.00 - ₹3,800,000.00 per year Benefits: Work from home Schedule: Day shift Monday to Friday Experience: Python: 3 years (Required) Data Engineering: 6 years (Required) Batch Technologies Hadoop, Hive, Athena, Presto, Spark: 4 years (Required) SQL / Queries: 3 years (Required) AWS Elastic MapReduce (EMR): 4 years (Required) ETL/ data pipeline implementations: 3 years (Required) AWS CDK, Cloud-formation, Lambda, Step-function: 4 years (Required) Athena: 3 years (Required) AWS Glue Catalog: 2 years (Required) CI/CD: GitHub Actions: 2 years (Required) Work Location: In person
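The automation and scripting work this role asks for often reduces to small helper utilities. As a hedged illustration (the bucket name, prefix layout, and `dt=` partition convention are all hypothetical, not from the posting), here is a sketch of generating the daily S3 partition prefixes a batch backfill would iterate over:

```python
from datetime import date, timedelta

# Build the S3 partition prefixes for a date range; the kind of helper used
# when automating daily batch loads. Bucket and prefix names are hypothetical.
def partition_prefixes(start: date, end: date,
                       prefix: str = "s3://example-bucket/events") -> list[str]:
    days = (end - start).days + 1
    return [f"{prefix}/dt={start + timedelta(d):%Y-%m-%d}/" for d in range(days)]

prefixes = partition_prefixes(date(2024, 1, 1), date(2024, 1, 3))
print(prefixes)
```

Keeping the prefix logic in one pure function means the same code drives a Glue crawler target list, a Lambda backfill trigger, or a local dry run.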

Posted 1 week ago

Apply

20.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Description Over the past 20 years, Amazon has earned the trust of over 300 million customers worldwide by providing unprecedented convenience, selection, and value on Amazon.com. By deploying Amazon Pay's products and services, merchants make it easy for these millions of customers to safely purchase from their third-party sites using the information already stored in their Amazon account. In this role, you will lead Data Engineering efforts to drive automation for the Amazon Pay organization. You will be part of the data engineering team that will envision, build, and deliver high-performance, fault-tolerant data pipelines. As a Data Engineer, you will work with cross-functional partners from Science, Product, SDEs, Operations, and leadership to translate raw data into actionable insights for stakeholders, empowering them to make data-driven decisions. Key job responsibilities Design, implement, and support a platform providing ad-hoc access to large data sets Interface with other technology teams to extract, transform, and load data from a wide variety of data sources Implement data structures using best practices in data modeling, ETL/ELT processes, and SQL, Redshift, and OLAP technologies Model data and metadata for ad-hoc and pre-built reporting Interface with business customers, gathering requirements and delivering complete reporting solutions Build robust and scalable data integration (ETL) pipelines using SQL, Python, and Spark. Build and deliver high-quality data sets to support business analysts, data scientists, and customer reporting needs.
Continually improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers Basic Qualifications 3+ years of data engineering experience Experience with data modeling, warehousing, and building ETL pipelines Experience with SQL Preferred Qualifications Experience with AWS technologies such as Redshift, S3, AWS Glue, EMR, Kinesis, Firehose, Lambda, and IAM roles and permissions Company - ADCI - Karnataka Job ID: A3011919
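Refreshing reporting datasets like those described above usually means incremental loads rather than full rebuilds. As a minimal sketch of the common upsert pattern (sqlite3 stands in for Redshift, and the `daily_sales` table is illustrative; Redshift itself would use a staged `MERGE` or delete-and-insert instead), a restated day is updated while a new day is inserted:

```python
import sqlite3

# Incremental "upsert" load for a reporting table; names are illustrative.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE daily_sales (day TEXT PRIMARY KEY, amount REAL)")
conn.execute("INSERT INTO daily_sales VALUES ('2024-01-01', 100.0)")

# The new batch restates an existing day and adds a new one.
batch = [("2024-01-01", 120.0), ("2024-01-02", 80.0)]
conn.executemany(
    "INSERT INTO daily_sales VALUES (?, ?) "
    "ON CONFLICT(day) DO UPDATE SET amount = excluded.amount",
    batch,
)
rows = conn.execute("SELECT day, amount FROM daily_sales ORDER BY day").fetchall()
print(rows)  # [('2024-01-01', 120.0), ('2024-01-02', 80.0)]
```

The key property is idempotence: re-running the same batch leaves the table unchanged, which is what makes pipeline retries safe.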

Posted 1 week ago

Apply

6.0 years

6 - 9 Lacs

Gurgaon

On-site

DESCRIPTION Want to build at the cutting edge of immersive shopping experiences? The Visual Innovation Team (VIT) is at the center of all advanced visual and immersive content at Amazon. We're pioneering VR and AR shopping, CGI, and GenAI. We are looking for a Design Technologist to help drive innovation in this space: someone who understands the technical problems the team may face from an artistic perspective and can provide creative technical solutions. This role is for you if you want to be part of: Partnering with world-class creatives and scientists to drive innovation in content creation Developing and expanding Amazon's VIT Virtual Production workflow Building one of the largest content libraries on the planet Driving the success and adoption of emerging experiences across Amazon Key job responsibilities We are looking for a Design Technologist with a specialty in workflow automation using novel technologies like GenAI and CV. You will prototype and deliver creative solutions to technical problems related to Amazon visuals. The right person will bring an implicit understanding of the balance needed between design, technology, and creative professionals, helping scale video content creation within Amazon by enabling our teams to work smarter, not harder.
Design Technologists in this role will: Act as a bridge between creative and engineering disciplines to solve multi-disciplinary problems Work directly with videographers and studio production to develop semi-automated production workflows Collaborate with other tech artists and engineers to build and maintain a centralized suite of creative workflows and tooling Work with creative leadership to research, prototype, and implement the latest industry trends that expand our production capabilities and improve efficiency A day in the life As a Design Technologist, a typical day will include, but is not limited to, coding and developing tools, workflows, and automation to improve the creative crew's experience and increase productivity. This position will focus on in-house video creation, with virtual production and GenAI workflows. You'll collaborate with production teams, observing, empathizing, and prototyping novel solutions. The ideal candidate is observant, creative, curious, and empathetic, understanding that problems often have multiple approaches. BASIC QUALIFICATIONS 6+ years of experience as a front-end technologist, engineer, or UX prototyper Coding samples in front-end programming languages An available online portfolio Experience developing visually polished, engaging, and highly fluid UX prototypes Experience collaborating with UX, Product, and technical partners PREFERRED QUALIFICATIONS Knowledge of databases and AWS database services: Elasticsearch, Redshift, DynamoDB Experience with machine learning (ML) tools and methods Job details IND, HR, Gurgaon Amazon Design

Posted 1 week ago

Apply