0.0 - 7.0 years
0 Lacs
Bengaluru, Karnataka
Remote
Bengaluru, Karnataka, India
Department: Data Engineering
Job posted on: Jul 09, 2025
Employment type: Full Time

About Us
MatchMove is a leading embedded finance platform that empowers businesses to embed financial services into their applications. We provide innovative solutions across payments, banking-as-a-service, and spend/send management, enabling our clients to drive growth and enhance customer experiences.

Are You The One?
As a Technical Lead Engineer - Data, you will architect, implement, and scale our end-to-end data platform built on AWS S3, Glue, Lake Formation, and DMS. You will lead a small team of engineers while working cross-functionally with stakeholders from fraud, finance, product, and engineering to enable reliable, timely, and secure data access across the business. You will champion best practices in data design, governance, and observability, while leveraging GenAI tools to improve engineering productivity and accelerate time to insight.

You will contribute to:
• Owning the design and scalability of the data lake architecture for both streaming and batch workloads, leveraging AWS-native services.
• Leading the development of ingestion, transformation, and storage pipelines using AWS Glue, DMS, Kinesis/Kafka, and PySpark.
• Structuring and evolving data into open table formats (Apache Iceberg, Delta Lake) to support real-time and time-travel queries for downstream services.
• Driving data productization, enabling API-first and self-service access to curated datasets for fraud detection, reconciliation, and reporting use cases.
• Defining and tracking SLAs and SLOs for critical data pipelines, ensuring high availability and data accuracy in a regulated fintech environment.
• Collaborating with InfoSec, SRE, and Data Governance teams to enforce data security, lineage tracking, access control, and compliance (GDPR, MAS TRM).
• Using Generative AI tools to enhance developer productivity, including auto-generating test harnesses, schema documentation, transformation scaffolds, and performance insights.
• Mentoring data engineers, setting technical direction, and ensuring delivery of high-quality, observable data pipelines.

Responsibilities
• Architect scalable, cost-optimized pipelines across real-time and batch paradigms, using tools such as AWS Glue, Step Functions, Airflow, or EMR.
• Manage ingestion from transactional sources using AWS DMS, with a focus on schema drift handling and low-latency replication.
• Design efficient partitioning, compression, and metadata strategies for Iceberg or Hudi tables stored in S3 and cataloged with Glue and Lake Formation.
• Build data marts, audit views, and analytics layers that support both machine-driven processes (e.g., fraud engines) and human-readable interfaces (e.g., dashboards).
• Ensure robust data observability with metrics, alerting, and lineage tracking via OpenLineage or Great Expectations.
• Lead quarterly reviews of data cost, performance, schema evolution, and architecture design with stakeholders and senior leadership.
• Enforce version control, CI/CD, and infrastructure-as-code practices using GitOps and tools like Terraform.

Requirements
• At least 7 years of experience in data engineering.
• Deep hands-on experience with the AWS data stack: Glue (Jobs & Crawlers), S3, Athena, Lake Formation, DMS, and Redshift Spectrum.
• Expertise in designing data pipelines for real-time, streaming, and batch systems, including schema design, format optimization, and SLAs.
• Strong programming skills in Python (PySpark) and advanced SQL for analytical processing and transformation.
• Proven experience managing data architectures using open table formats (Iceberg, Delta Lake, Hudi) at scale.
• Understanding of stream processing with Kinesis/Kafka and orchestration via Airflow or Step Functions.
• Experience implementing data access controls, encryption policies, and compliance workflows in regulated environments.
• Ability to integrate GenAI tools into data engineering processes to drive measurable productivity and quality gains, with strong engineering hygiene.
• Demonstrated ability to lead teams, drive architectural decisions, and collaborate with cross-functional stakeholders.

Brownie Points
• Experience working in a PCI DSS or other central-bank-regulated environment with audit logging and data retention requirements.
• Experience in the payments or banking domain, with use cases around reconciliation, chargeback analysis, or fraud detection.
• Familiarity with data contracts, data mesh patterns, and data-as-a-product principles.
• Experience using GenAI to automate data documentation, generate data tests, or support reconciliation use cases.
• Exposure to performance tuning and cost optimization strategies in AWS Glue, Athena, and S3.
• Experience building data platforms for ML/AI teams or integrating with model feature stores.

MatchMove Culture
We cultivate a dynamic and innovative culture that fuels growth, creativity, and collaboration. Our fast-paced fintech environment thrives on adaptability, agility, and open communication. We focus on employee development, supporting continuous learning and growth through training programs, on-the-job learning, and mentorship. We encourage speaking up, sharing ideas, and taking ownership. Embracing diversity, our team spans Asia, fostering a rich exchange of perspectives and experiences. Together, we harness the power of fintech and e-commerce to make a meaningful impact on people's lives. Grow with us and shape the future of fintech and e-commerce. Join us and be part of something bigger!
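The ingestion requirements above call out schema drift handling during DMS-style replication. A minimal sketch of the idea, detecting additive drift between a source and target schema, might look like this (the table and column names are invented for illustration, not from any real system):

```python
# Minimal schema-drift check: compare a source table's columns against the
# target's, and emit ALTER statements for columns the target is missing.
# Handles additive drift only; renames and type changes need richer logic.

def detect_drift(source_schema: dict, target_schema: dict) -> list:
    """Return ALTER TABLE statements for columns present in the source
    but absent from the target (names here are hypothetical)."""
    statements = []
    for column, col_type in source_schema.items():
        if column not in target_schema:
            statements.append(
                f"ALTER TABLE transactions ADD COLUMN {column} {col_type}"
            )
    return statements

source = {"txn_id": "BIGINT", "amount": "DECIMAL(18,2)", "channel": "VARCHAR(32)"}
target = {"txn_id": "BIGINT", "amount": "DECIMAL(18,2)"}

print(detect_drift(source, target))
# ['ALTER TABLE transactions ADD COLUMN channel VARCHAR(32)']
```

A production replication job would apply such statements (or evolve the Iceberg table schema) before loading, rather than failing on unknown columns.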
Personal Data Protection Act: By submitting your application for this job, you are authorizing MatchMove to: (a) collect and use your personal data, and disclose such data to any third party with whom MatchMove or any of its related corporations has service arrangements, in each case for all purposes in connection with your job application and employment with MatchMove; and (b) retain your personal data for one year for consideration of future job opportunities (where applicable).
Posted 2 weeks ago
0.0 years
0 Lacs
Bengaluru, Karnataka
On-site
- 2+ years of experience analyzing and interpreting data with Redshift, Oracle, NoSQL, etc.
- Experience with one or more industry analytics visualization tools (e.g., Excel, Tableau, QuickSight, MicroStrategy, Power BI) and statistical methods (e.g., t-test, chi-squared)
- Experience with a scripting language (e.g., Python, Java, or R)
- Experience building back-end aggregated tables/pipelines using ANDES, AWS Cradle, S3

IN-COBRA (center of business reporting and analytics), the central BIE team for IN-Stores, requires a BIE who will help us make effective decisions based on data from multiple sources, compiling and triangulating it into a digestible and actionable format and dashboards. This will help us provide historic information on our business and customer metrics and make effective decisions for the future. The BIE will create pipelines for reports to analyze data, make sense of the results, and explain what it all means to key stakeholders. This individual will analyze large amounts of data, discover and solve real-world problems, and build metrics and business cases around key performance of the Project Tez programs. The ideal candidate will use a customer-backwards approach in deriving insights and identifying actions we can take to improve the customer experience and conversion for the program.

Key job responsibilities
• Develop and streamline necessary dashboards and one-off analyses, providing the ability to surface business-critical KPIs, monitor the health of metrics, and effectively communicate performance.
• Partner with stakeholders and other Business Intelligence teams to acquire the data necessary for robust analysis.
• Convert data into insights, including implications and recommendations that are specific and actionable for the Private Brands team and across the business.
• Partner with other analysts as well as data engineering and technology teams to support building best-in-class dashboards and data infrastructure.
• Communicate insights using data visualization and presentations to stakeholders

The successful candidate will be an expert at analyzing large data sets and have exemplary communication skills. The candidate will need to be a self-starter, very comfortable with ambiguity in a fast-paced and ever-changing environment, and able to think big while paying careful attention to detail.

• Master's degree or advanced technical degree
• Knowledge of data modeling and data pipeline design
• Experience with statistical analysis and correlation analysis

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.
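The BIE role above centers on building aggregated tables and pipelines that dashboards read from. A toy version of that pattern, with SQLite standing in for Redshift and invented order data, could be:

```python
import sqlite3

# Toy aggregation pipeline: raw order rows -> a daily KPI table a dashboard
# would query. SQLite stands in for Redshift; all data here is invented.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_day TEXT, units INTEGER, revenue REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [("2025-07-01", 3, 45.0), ("2025-07-01", 1, 20.0), ("2025-07-02", 2, 30.0)],
)

# The aggregated reporting table: one row per day with rolled-up KPIs.
conn.execute(
    """CREATE TABLE daily_kpis AS
       SELECT order_day, SUM(units) AS units, SUM(revenue) AS revenue
       FROM orders GROUP BY order_day"""
)
for row in conn.execute("SELECT * FROM daily_kpis ORDER BY order_day"):
    print(row)
# ('2025-07-01', 4, 65.0)
# ('2025-07-02', 2, 30.0)
```

In practice the same SELECT would be scheduled as a pipeline job that refreshes the aggregate on each run, so dashboards never scan the raw fact table.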
Posted 2 weeks ago
12.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
About Crunchyroll
Founded by fans, Crunchyroll delivers the art and culture of anime to a passionate community. We super-serve over 100 million anime and manga fans across 200+ countries and territories, and help them connect with the stories and characters they crave. Whether that experience is online or in-person (streaming video, theatrical, games, merchandise, events, and more), it's powered by the anime content we all love. Join our team, and help us shape the future of anime!

Who We Are
We're a cast of characters working to shine a spotlight on anime. Crunchyroll is an international business focused on creating both online and offline experiences for fans through content (licensed, co-produced, originals, distribution), merchandise, events, gaming, news, and more. Visit our About Us pages for more information about our collection of brands.

Location: Hyderabad, India
The intersection of media and technology is our sweet spot, and we are fortunate to have a global office in Hyderabad, India. This office houses many of our corporate functions and cross-functional teams tasked with creating exceptional experiences for our passionate communities.

About the Team
The Center for Data and Insights (CDI) is the centralized team of data engineering, BI, analytics, and data science experts, passionate about serving the organization with certified reports and insights! The mission of the group is to inspire, support, and guide our partners to be data-aware and to build the systems of intelligence to discover insights and act on them.

About the Role
We are looking for a Director, CDI Operations who will manage partner relationships, ensure project success, and drive satisfaction. You will report to the Senior Director, Data Analytics. You will be a key point of contact for our growing global organization in our efforts towards growth and strategy. Your work also involves identifying opportunities for growth within existing stakeholder relationships.
Responsibilities:
• Stakeholder Relationship Management: Build and maintain engaging, trusting relationships between global team members.
• Project Management: Oversee project execution, ensuring timelines are met, budgets are observed, and project scope is well-defined.
• Communication: Act as the primary point of contact, communicating project progress and updates to stakeholders and internal teams.
• Problem Solving: Identify and resolve challenges that may arise during the project lifecycle, ensuring partner satisfaction.
• Opportunity Identification: Identify and pursue new business opportunities with existing partners, potentially leading to expanded engagements.
• Team Leadership: Guide and develop team members involved in stakeholder projects.
• Partner Onboarding and Education: Ensure new partners are onboarded and understand the value of the service/product.
• Global Business Impact: Lead and influence business principles and how they apply to stakeholder engagements.
• Global Team Management: Be available during US/India time zones to collaborate with stakeholders.

About You:
• 12+ years in partner relationship management.
• 12+ years in project management: overseeing project execution, ensuring timelines are met, budgets are followed, and project scope is well-defined.
• 10+ years in a technical role, such as data analytics, data engineering, or data science.
• Knowledge of cloud data warehouses such as Redshift, Snowflake, and Imply.
• Knowledge of visualization tools such as Tableau.
• Experience with large data sets (terabytes of data / billions of records).
• 5+ years of consulting experience in international environments, with the stature necessary to work as a partner with senior colleagues and clients.
• 5+ years of onsite/offshore management experience.
• Experience breaking down and solving problems through quantitative analysis.
• Knowledge of the entertainment domain or an equivalent B2C industry.
• Bachelor's degree in Business, Management, Data Science, or a related field.

A Day in the Life:
On a daily basis, partner with CDI stakeholders in a structured manner, both verbally and in writing, including colleagues with different perspectives and seniority levels. Collaborate across time zones using relevant digital productivity and communication tools (e.g., email, Slack, Zoom). Work with offshore and onsite teams, including a 3-4 hour overlap with the US team. Maintain a culture of high-quality output and outstanding customer service.

Why you will love working at Crunchyroll
In addition to getting to work with fun, passionate and inspired colleagues, you will also enjoy the following benefits and perks:
• Best-in-class medical, dental, and vision private insurance healthcare coverage
• Access to counseling & mental health sessions 24/7 through our Employee Assistance Program (EAP)
• Free premium access to Crunchyroll
• Professional development
• Company-paid parental leave: up to 26 weeks for birthing parents and up to 12 weeks for non-birthing parents
• Hybrid work schedule
• Paid Time Off: Flex Time Off, 5 Yasumi Days, Half-Day Fridays during the summer, Winter Break

About Our Values
We want to be everything for someone rather than something for everyone, and we do this by living and modeling our values in all that we do. We value:
• Courage. We believe that when we overcome fear, we enable our best selves.
• Curiosity. We are curious, which is the gateway to empathy, inclusion, and understanding.
• Kaizen. We have a growth mindset committed to constant forward progress.
• Service. We serve our community with humility, enabling joy and belonging for others.

Our commitment to diversity and inclusion
Our mission of helping people belong reflects our commitment to diversity & inclusion. It's just the way we do business. We are an equal opportunity employer and value diversity at Crunchyroll.
Pursuant to applicable law, we do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status. Crunchyroll, LLC is an independently operated joint venture between US-based Sony Pictures Entertainment and Japan's Aniplex, a subsidiary of Sony Music Entertainment (Japan) Inc., both subsidiaries of Tokyo-based Sony Group Corporation.

Questions about Crunchyroll's hiring process? Please check out our Hiring FAQs: https://help.crunchyroll.com/hc/en-us/articles/360040471712-Crunchyroll-Hiring-FAQs

Please refer to our Candidate Privacy Policy for more information about how we process your personal information and your data protection rights: https://tbcdn.talentbrew.com/company/22978/v1_0/docs/spe-jobs-privacy-policy-update-for-crpa-dec-21-22.pdf

Please beware of recent scams targeting online job seekers. Applicants to our job openings will only be contacted directly from an @crunchyroll.com email account.
Posted 2 weeks ago
6.0 years
0 Lacs
Gandhinagar, Gujarat, India
On-site
Role Overview
We are seeking a Lead Data Engineer who will take ownership of our data infrastructure, manage a small data team, and oversee the design and implementation of reporting systems. This role is perfect for someone with strong technical skills in data engineering who also has experience leading projects and delivering business-critical dashboards and reports.

Responsibilities:
• Team Leadership: Lead and mentor a team of data engineers and BI developers. Assign tasks, review code, and ensure timely delivery of projects.
• Pipeline Management: Design, build, and maintain scalable ETL/ELT pipelines across various data sources.
• Reporting & BI Oversight: Oversee the development and delivery of operational and executive reports. Ensure data accuracy and alignment with business goals.
• Data Warehousing: Architect and optimize data warehouses (e.g., Snowflake, Redshift, BigQuery) to support analytical workloads and real-time reporting.
• Collaboration: Work closely with business and analytics teams to understand data needs and translate them into technical solutions.
• Governance & Quality: Implement standards for data governance, documentation, and quality.
• Tooling: Evaluate and integrate new tools for data transformation and visualization (e.g., Tableau, Power BI, Looker).

Requirements:
• 6+ years of experience in data engineering, with at least 2 years in a lead role.
• Strong experience in SQL, Python, and ETL tools (e.g., Airflow, dbt).
• Experience with BI/reporting tools like Power BI, Tableau, or Looker.
• Deep understanding of data modeling and warehouse architecture.
• Familiarity with cloud platforms (AWS, GCP, Azure).
• Excellent communication and stakeholder management skills.
• Experience managing or mentoring junior team members. (ref:hirist.tech)
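The orchestration side of the role above (Airflow/dbt-style pipeline management) comes down to running tasks in dependency order. A dependency resolver in plain Python, using the standard library's topological sorter and invented task names, is a reasonable sketch of the core idea:

```python
from graphlib import TopologicalSorter

# Toy DAG of ETL steps in the spirit of an Airflow/dbt dependency graph.
# Each key maps a task to the set of tasks it depends on; names are invented.
dag = {
    "load_warehouse": {"transform"},
    "transform": {"extract_orders", "extract_customers"},
    "build_dashboard": {"load_warehouse"},
}

# static_order() yields every task after all of its dependencies.
order = list(TopologicalSorter(dag).static_order())
print(order)
```

Real orchestrators add scheduling, retries, and parallelism on top, but the execution order they compute is exactly this topological sort.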
Posted 2 weeks ago
7.0 years
0 Lacs
Gandhinagar, Gujarat, India
On-site
Key Responsibilities
• Lead and mentor a high-performing data pod composed of data engineers, data analysts, and BI developers.
• Design, implement, and maintain ETL pipelines and data workflows to support real-time and batch processing.
• Architect and optimize data warehouses for scale, performance, and security.
• Perform advanced data analysis and modeling to extract insights and support business decisions.
• Lead data science initiatives including predictive modeling, NLP, and statistical analysis.
• Manage and tune relational and non-relational databases (SQL, NoSQL) for availability and performance.
• Develop Power BI dashboards and reports for stakeholders across departments.
• Ensure data quality, integrity, and compliance with data governance and security standards.
• Work with cross-functional teams (product, marketing, ops) to turn data into strategy.

Qualifications:
• PhD in Data Science, Computer Science, Engineering, Mathematics, or a related field.
• 7+ years of hands-on experience across data engineering, data science, analysis, and database administration.
• Strong experience with ETL tools (e.g., Airflow, Talend, SSIS) and data warehouses (e.g., Snowflake, Redshift, BigQuery).
• Proficient in SQL, Python, and Power BI.
• Familiarity with modern cloud data platforms (AWS/GCP/Azure).
• Strong understanding of data modeling, data governance, and MLOps practices.
• Exceptional ability to translate business needs into scalable data solutions.
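The predictive-modeling duties above start, in their simplest form, with fitting a trend line. A least-squares fit in pure Python on made-up monthly figures shows the mechanics (the revenue numbers are invented and chosen to be perfectly linear so the result is easy to check):

```python
# Ordinary least squares for y = a*x + b, the simplest "predictive model".
# Input figures are invented for illustration.

def fit_line(xs, ys):
    """Return (slope, intercept) minimizing squared error."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
            sum((x - mean_x) ** 2 for x in xs)
    intercept = mean_y - slope * mean_x
    return slope, intercept

months = [1, 2, 3, 4]
revenue = [10.0, 12.0, 14.0, 16.0]   # perfectly linear on purpose
a, b = fit_line(months, revenue)
print(a, b)          # 2.0 8.0
print(a * 5 + b)     # forecast for month 5 -> 18.0
```

Real work would reach for scikit-learn or statsmodels, but the closed-form slope and intercept here are exactly what those libraries compute for the one-variable case.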
Posted 2 weeks ago
4.0 years
0 Lacs
Hyderabad, Telangana, India
Remote
Job Description
Are you ready to make an impact at DTCC? Do you want to work on innovative projects, collaborate with a dynamic and supportive team, and receive investment in your professional development? At DTCC, we are at the forefront of innovation in the financial markets. We're committed to helping our employees grow and succeed. We believe that you have the skills and drive to make a real impact. We foster a thriving internal community and are committed to creating a workplace that looks like the world that we serve.

Pay and Benefits:
• Competitive compensation, including base pay and annual incentive.
• Comprehensive health and life insurance and well-being benefits, based on location.
• Pension / retirement benefits.
• Paid Time Off and Personal/Family Care, and other leaves of absence when needed to support your physical, financial, and emotional well-being.
• DTCC offers a flexible/hybrid model of 3 days onsite and 2 days remote (onsite Tuesdays, Wednesdays, and a third day unique to each team or employee).

The Impact You Will Have In This Role
The Development family is responsible for creating, designing, deploying, and supporting applications, programs, and software solutions. This may include research, new development, prototyping, modification, reuse, re-engineering, maintenance, or any other activities related to software products used internally or externally on product platforms supported by the firm. The software development process requires in-depth subject matter expertise in existing and emerging development methodologies, tools, and programming languages. Software Developers work closely with business partners and/or external clients in defining requirements and implementing solutions. The Software Engineering role specializes in planning, documenting technical requirements, crafting, developing, and testing all software systems and applications for the firm.
Works closely with architects, product managers, project management, and end-users in the development and improvement of existing software systems and applications, proposing and recommending solutions that solve complex business problems.

Your Primary Responsibilities
• Participate in daily code deploys while working on individual or team projects.
• Translate business requirements into software designs and implementations.
• Participate in thorough code reviews with the goal of illustrating quality engineering practices and producing the highest quality code possible.
• Build high-quality, scalable, and performant applications.
• Understand requirements and translate them into specific application and infrastructure related tasks.
• Design frameworks that promote concepts of isolation, extensibility, and reusability.
• Support the team in handling client expectations and resolving issues urgently.
• Support development teams, testing, troubleshooting, and production support.
• Create applications and construct unit test cases that ensure compliance with functional and non-functional requirements.
• Work with peers to mature ways of working, continuous integration, and continuous delivery.
• Align risk and control processes into day-to-day responsibilities to monitor and mitigate risk; escalate appropriately.

Qualifications
• Minimum of 4 years of related experience.
• Bachelor's degree preferred or equivalent experience.

Talents Needed For Success
• 4+ years' experience in application development and system analysis.
• Expert in Java/JEE and coding standard methodologies.
• Expert knowledge in development concepts.
• Good design and coding skills in Web Services, Spring/Spring Boot, SOAP/REST APIs, and JavaScript frameworks for modern web applications.
• Solid understanding of HTML, CSS, and modern JavaScript.
• Experience with Angular v15+ and/or React.
• Experience integrating with database technologies such as Oracle, PostgreSQL, etc.
• Ability to write quality, self-validating code using unit tests and following TDD.
• Experience with Agile methodology and ability to collaborate with other team members.
• Bachelor's degree in a technical field or equivalent experience.

Nice To Have
• Experience developing with and using the AWS cloud stack (S3, SQS, Redshift, Lambda, etc.) is a big plus.
• Ability to demonstrate DevOps techniques and practices like Continuous Integration, Continuous Deployment, Test Automation, Build Automation, and Test-Driven Development to enable the rapid delivery of working code, utilizing tools like Jenkins, CloudBees, Git, etc.

We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, veteran status, or disability status. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
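"Self-validating code using unit tests and following TDD," as required above, means the test states the contract before the code satisfies it. A tiny sketch of the shape (shown in Python for brevity rather than the Java/JUnit stack this role uses; the masking function is an invented example):

```python
import unittest

# TDD-style unit: the test class states the contract, the function satisfies
# it. mask_account is a hypothetical example, not DTCC code.

def mask_account(number: str) -> str:
    """Keep the last four characters, mask the rest."""
    return "*" * max(len(number) - 4, 0) + number[-4:]

class MaskAccountTest(unittest.TestCase):
    def test_masks_all_but_last_four(self):
        self.assertEqual(mask_account("123456789"), "*****6789")

    def test_short_numbers_are_left_intact(self):
        self.assertEqual(mask_account("123"), "123")

suite = unittest.defaultTestLoader.loadTestsFromTestCase(MaskAccountTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print(result.wasSuccessful())   # True
```

The same rhythm applies in JUnit: write the failing test first, make it pass, refactor, and let CI run the suite on every deploy.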
Posted 2 weeks ago
40.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
About Amgen
Amgen harnesses the best of biology and technology to fight the world’s toughest diseases, and make people’s lives easier, fuller and longer. We discover, develop, manufacture and deliver innovative medicines to help millions of patients. Amgen helped establish the biotechnology industry more than 40 years ago and remains on the cutting edge of innovation, using technology and human genetic data to push beyond what’s known today.

About The Role
Role Description: As a Sr. Associate BI Engineer, you will support the development and delivery of data-driven solutions that enable business insights and operational efficiency. You will work closely with senior BI engineers, analysts, and stakeholders to build dashboards, analyze data, and contribute to the design of scalable reporting systems. This is an ideal role for early-career professionals looking to grow their technical and analytical skills in a collaborative environment.

Roles & Responsibilities:
• Design and maintain dashboards and reports using tools like Power BI, Tableau, or Cognos.
• Perform data analysis to identify trends and support business decisions.
• Gather BI requirements and translate them into technical specifications.
• Support data validation, testing, and documentation efforts.
• Apply best practices in data modeling, visualization, and BI development.
• Participate in Agile ceremonies and contribute to sprint planning and backlog grooming.

Basic Qualifications and Experience:
• Bachelor's or Master’s degree in Computer Science, IT, or a related field.
• At least 5 years of relevant experience.

Functional Skills:
• Exposure to data visualization tools such as Power BI, Tableau, or QuickSight.
• Proficiency in SQL and scripting languages (e.g., Python) for data processing and analysis
• Familiarity with data modeling, warehousing, and ETL pipelines
• Understanding of data structures and reporting concepts
• Strong analytical and problem-solving skills

Good-to-Have Skills:
• Familiarity with cloud services like AWS (e.g., Redshift, S3, EC2)
• Understanding of Agile methodologies (Scrum, SAFe)
• Knowledge of DevOps and CI/CD practices
• Familiarity with scientific or healthcare data domains

Soft Skills:
• Strong verbal and written communication skills
• Willingness to learn and take initiative
• Ability to work effectively in a team environment
• Attention to detail and commitment to quality
• Ability to manage time and prioritize tasks effectively

Shift Information: This position may require working a second or third shift based on business needs. Candidates must be willing and able to work during evening or night shifts if required.

EQUAL OPPORTUNITY STATEMENT
Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request an accommodation.
Posted 2 weeks ago
40.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
About Amgen
Amgen harnesses the best of biology and technology to fight the world’s toughest diseases, and make people’s lives easier, fuller and longer. We discover, develop, manufacture and deliver innovative medicines to help millions of patients. Amgen helped establish the biotechnology industry more than 40 years ago and remains on the cutting edge of innovation, using technology and human genetic data to push beyond what’s known today.

About The Role
Role Description: As a BI Analyst in the Business Intelligence, Reporting, and Sensing team, you will play a critical role in transforming data into actionable insights that drive strategic decisions. You will collaborate with cross-functional teams to gather requirements, design analytical solutions, and deliver high-quality dashboards and reports. This role blends technical expertise with business acumen and requires strong communication and problem-solving skills.

Roles & Responsibilities:
• Collaborate with System Architects and Product Managers to manage business analysis activities, ensuring alignment with engineering and product goals.
• Support the design, development, and maintenance of interactive dashboards, reports, and data visualizations using BI tools (e.g., Power BI, Tableau, Cognos).
• Analyze datasets to identify trends, patterns, and insights that inform business strategy and decision-making.
• Collaborate with stakeholders across departments to understand data and reporting needs.
• Translate business requirements into technical specifications and analytical solutions.
• Work with Data Engineers to ensure data models and pipelines support accurate and reliable reporting.
• Contribute to data quality and governance initiatives.
• Document business processes, use cases, and test plans to support development and QA efforts.
• Participate in Agile ceremonies and contribute to backlog refinement and sprint planning.
Basic Qualifications and Experience:
• Bachelor's or Master’s degree in Computer Science, IT, or a related field, with at least 5 years of experience as a Business Analyst or in relevant areas; OR
• Bachelor’s degree and 3 to 5 years of Computer Science, IT, or related field experience; OR
• Diploma and 7 to 9 years of Computer Science, IT, or related field experience

Functional Skills:
• Experience with data visualization tools such as Power BI, Tableau, or QuickSight
• Proficiency in SQL and scripting languages (e.g., Python) for data processing and analysis
• Familiarity with data modeling, warehousing, and ETL pipelines
• Experience writing user stories and acceptance criteria in Agile tools like JIRA
• Strong analytical and problem-solving skills

Good-to-Have Skills:
• Experience with AWS services (e.g., Redshift, S3, EC2)
• Understanding of Agile methodologies (Scrum, SAFe)
• Knowledge of DevOps and CI/CD practices
• Familiarity with scientific or healthcare data domains

Professional Certifications:
• AWS Developer certification (preferred)
• SAFe for Teams certification (preferred)

Soft Skills:
• Excellent analytical and troubleshooting skills
• Strong verbal and written communication skills
• Ability to work effectively with global, virtual teams
• High degree of initiative and self-motivation
• Ability to manage multiple priorities successfully
• Team-oriented, with a focus on achieving team goals
• Strong presentation and public speaking skills

Shift Information: This position may require working a second or third shift based on business needs. Candidates must be willing and able to work during evening or night shifts if required.

EQUAL OPPORTUNITY STATEMENT
Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status.
We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request an accommodation.
Posted 2 weeks ago
5.0 - 10.0 years
22 - 37 Lacs
Kolkata, Hyderabad, Bengaluru
Hybrid
Inviting applications for the role of Senior Principal Consultant - Data Engineer, AWS!

Locations: Bangalore, Hyderabad, Kolkata

Responsibilities
• Develop, deploy, and manage ETL pipelines using AWS services, Python, Spark, and Kafka.
• Integrate structured and unstructured data from various data sources into data lakes and data warehouses.
• Design and deploy scalable, highly available, and fault-tolerant AWS data processes using AWS data services (Glue, Lambda, Step Functions, Redshift).
• Monitor and optimize the performance of cloud resources to ensure efficient utilization and cost-effectiveness.
• Implement and maintain security measures to protect data and systems within the AWS environment, including IAM policies, security groups, and encryption mechanisms.
• Migrate application data from legacy databases to cloud-based solutions (Redshift, DynamoDB, etc.) for high availability at low cost.
• Develop application programs using big data technologies like Apache Hadoop and Apache Spark, with appropriate cloud-based services such as Amazon AWS.
• Build data pipelines by building ETL (Extract-Transform-Load) processes.
• Implement backup, disaster recovery, and business continuity strategies for cloud-based applications and data.
Responsible for analysing business and functional requirements, which involves a review of existing system configurations and operating methodologies as well as understanding evolving business needs Analyse requirements/user stories in business meetings, strategize the impact of requirements on different platforms/applications, and convert business requirements into technical requirements Participate in design reviews to provide input on functional requirements, product designs, schedules, and/or potential problems Understand current application infrastructure and suggest cloud-based solutions that reduce operational cost and require minimal maintenance while providing high availability with improved security Perform unit testing on the modified software to ensure that the new functionality is working as expected while existing functionalities continue to work in the same way Coordinate with release management and other supporting teams to deploy changes in the production environment Qualifications we seek in you! Minimum Qualifications Experience in designing and implementing data pipelines, building data applications, and data migration on AWS Strong experience of implementing data lakes using AWS services like Glue, Lambda, Step Functions, Redshift Experience with Databricks is an added advantage Strong experience in Python and SQL Proven expertise in AWS services such as S3, Lambda, Glue, EMR, and Redshift. Advanced programming skills in Python for data processing and automation. Hands-on experience with Apache Spark for large-scale data processing. Experience with Apache Kafka for real-time data streaming and event processing. Proficiency in SQL for data querying and transformation. Strong understanding of security principles and best practices for cloud-based environments. Experience with monitoring tools and implementing proactive measures to ensure system availability and performance.
Excellent problem-solving skills and ability to troubleshoot complex issues in a distributed, cloud-based environment. Strong communication and collaboration skills to work effectively with cross-functional teams. Preferred Qualifications/Skills Master's degree in Computer Science, Electronics, or Electrical Engineering. AWS Data Engineering & Cloud certifications, Databricks certifications Experience with multiple data integration technologies and cloud platforms Knowledge of Change & Incident Management processes
Posted 2 weeks ago
1.0 years
0 Lacs
Hyderabad, Telangana, India
Remote
Description Want to join the Earth’s most customer-centric company? Do you like to dive deep to understand problems? Are you someone who likes to challenge the status quo? Do you strive to excel at goals assigned to you? If yes, we have opportunities for you. Global Operations – Artificial Intelligence (GO-AI) at Amazon is looking to hire candidates who can excel in a fast-paced, dynamic environment. Are you someone who likes to use and analyze big data to drive business decisions? Do you enjoy converting data into insights that will be used to enhance customer decisions worldwide for business leaders? Do you want to be part of the data team which measures the pulse of innovative machine vision-based projects? If your answer is yes, join our team. GO-AI is looking for a motivated individual with strong skills and experience in resource utilization planning, process optimization, and execution of scalable and robust operational mechanisms to join the GO-AI Ops DnA team. In this position you will be responsible for supporting our sites to build solutions for the rapidly expanding GO-AI team. The role requires the ability to work with a variety of key stakeholders across job functions and multiple sites. We are looking for an entrepreneurial and analytical program manager who is passionate about their work, understands how to manage service levels across multiple skills/programs, and is willing to move fast and experiment often. Key job responsibilities Design and develop highly available dashboards and metrics using SQL and Excel/Tableau Execute high-priority (i.e.
cross-functional, high-impact) projects to create robust, scalable analytics solutions and frameworks with the help of Analytics/BIE managers Work closely with internal stakeholders such as business teams, engineering teams, and partner teams and align them with respect to your focus area Create and maintain comprehensive business documentation including user stories, acceptance criteria, and process flows that help the BIE understand the context for developing ETL processes and visualization solutions Perform user acceptance testing and business validation of delivered dashboards and reports, ensuring that BIE-created solutions meet actual operational needs and can be effectively utilized by site managers and operations teams Monitor business performance metrics and operational KPIs to proactively identify emerging analytical requirements, working with BIEs to rapidly develop solutions that address real-time operational challenges in the dynamic AI-enhanced fulfillment environment About The Team The Global Operations – Artificial Intelligence (GO-AI) team remotely handles exceptions in Amazon Robotic Fulfillment Centers globally. GO-AI seeks to complement automated vision-based decision-making technologies by providing remote human support for the subset of tasks which require higher cognitive ability and cannot be processed through automated decision making with high confidence. This team provides end-to-end solutions through inbuilt competencies of Operations and strong central specialized teams to deliver programs at Amazon scale. It operates multiple programs including Nike IDS, Proteus, Sparrow, and other new initiatives in partnership with global technology and operations teams.
Basic Qualifications Experience defining requirements and using data and metrics to draw business insights Knowledge of SQL Knowledge of data visualization tools such as QuickSight, Tableau, Power BI, or other BI packages Knowledge of Python, VBA, Macros, Selenium scripts 1+ year of experience working in an Analytics / Business Intelligence environment with prior experience of design and execution of analytical projects Preferred Qualifications Experience in using AI tools Experience in Amazon Redshift and other AWS technologies for large datasets Analytical mindset and ability to see the big picture and influence others Detail-oriented, with an aptitude for solving unstructured problems. The role will require the ability to extract data from various sources and to design/construct/execute complex analyses to finally come up with data/reports that help solve the business problem Good oral, written, and presentation skills combined with the ability to take part in group discussions and explain complex solutions Ability to apply analytical, computer, statistical, and quantitative problem-solving skills Ability to work effectively in a multitasking, high-volume environment Ability to be adaptable and flexible in responding to deadlines and workflow fluctuations Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner. Company - Amazon Dev Center India - Hyderabad Job ID: A3027310
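The dashboard-and-metrics work this role describes typically starts from a SQL aggregate; the sketch below uses an in-memory SQLite table with an invented schema as a stand-in for a warehouse such as Redshift feeding QuickSight or Tableau.

```python
import sqlite3

# In-memory stand-in for an operations table feeding a dashboard.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tasks (site TEXT, status TEXT)")
conn.executemany("INSERT INTO tasks VALUES (?, ?)", [
    ("HYD1", "done"), ("HYD1", "done"), ("HYD1", "failed"),
    ("BLR1", "done"),
])

# A typical dashboard metric: per-site completion rate.
rows = conn.execute("""
    SELECT site,
           ROUND(100.0 * SUM(status = 'done') / COUNT(*), 1) AS done_pct
    FROM tasks
    GROUP BY site
    ORDER BY site
""").fetchall()
print(rows)  # one (site, done_pct) tuple per site
```

The same aggregate, pointed at the production table, becomes the dataset behind a BI tile; the Python wrapper here only exists to make the query runnable locally.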
Posted 2 weeks ago
1.0 years
0 Lacs
Hyderabad, Telangana, India
Remote
Description Want to join the Earth’s most customer-centric company? Do you like to dive deep to understand problems? Are you someone who likes to challenge the status quo? Do you strive to excel at goals assigned to you? If yes, we have opportunities for you. Global Operations – Artificial Intelligence (GO-AI) at Amazon is looking to hire candidates who can excel in a fast-paced, dynamic environment. Are you someone who likes to use and analyze big data to drive business decisions? Do you enjoy converting data into insights that will be used to enhance customer decisions worldwide for business leaders? Do you want to be part of the data team which measures the pulse of innovative machine vision-based projects? If your answer is yes, join our team. GO-AI is looking for a motivated individual with strong skills and experience in resource utilization planning, process optimization, and execution of scalable and robust operational mechanisms to join the GO-AI Ops DnA team. In this position you will be responsible for supporting our sites to build solutions for the rapidly expanding GO-AI team. The role requires the ability to work with a variety of key stakeholders across job functions and multiple sites. We are looking for an entrepreneurial and analytical program manager who is passionate about their work, understands how to manage service levels across multiple skills/programs, and is willing to move fast and experiment often. Key job responsibilities Design and develop highly available dashboards and metrics using SQL and Excel/Tableau Execute high-priority (i.e.
cross-functional, high-impact) projects to create robust, scalable analytics solutions and frameworks with the help of Analytics/BIE managers Work closely with internal stakeholders such as business teams, engineering teams, and partner teams and align them with respect to your focus area Create and maintain comprehensive business documentation including user stories, acceptance criteria, and process flows that help the BIE understand the context for developing ETL processes and visualization solutions Perform user acceptance testing and business validation of delivered dashboards and reports, ensuring that BIE-created solutions meet actual operational needs and can be effectively utilized by site managers and operations teams Monitor business performance metrics and operational KPIs to proactively identify emerging analytical requirements, working with BIEs to rapidly develop solutions that address real-time operational challenges in the dynamic AI-enhanced fulfillment environment About The Team The Global Operations – Artificial Intelligence (GO-AI) team remotely handles exceptions in Amazon Robotic Fulfillment Centers globally. GO-AI seeks to complement automated vision-based decision-making technologies by providing remote human support for the subset of tasks which require higher cognitive ability and cannot be processed through automated decision making with high confidence. This team provides end-to-end solutions through inbuilt competencies of Operations and strong central specialized teams to deliver programs at Amazon scale. It operates multiple programs including Nike IDS, Proteus, Sparrow, and other new initiatives in partnership with global technology and operations teams.
Basic Qualifications Experience defining requirements and using data and metrics to draw business insights Knowledge of SQL Knowledge of data visualization tools such as QuickSight, Tableau, Power BI, or other BI packages Knowledge of Python, VBA, Macros, Selenium scripts 1+ year of experience working in an Analytics / Business Intelligence environment with prior experience of design and execution of analytical projects Preferred Qualifications Experience in using AI tools Experience in Amazon Redshift and other AWS technologies for large datasets Analytical mindset and ability to see the big picture and influence others Detail-oriented, with an aptitude for solving unstructured problems. The role will require the ability to extract data from various sources and to design/construct/execute complex analyses to finally come up with data/reports that help solve the business problem Good oral, written, and presentation skills combined with the ability to take part in group discussions and explain complex solutions Ability to apply analytical, computer, statistical, and quantitative problem-solving skills Ability to work effectively in a multitasking, high-volume environment Ability to be adaptable and flexible in responding to deadlines and workflow fluctuations Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner. Company - Amazon Dev Center India - Hyderabad Job ID: A3027313
Posted 2 weeks ago
5.0 - 8.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Job Title: Informatica IDMC Developer Skills: Informatica Intelligent Data Management Cloud (IDMC/IICS), SQL, AWS/Azure/GCP, CI/CD pipelines, Snowflake, Redshift, or BigQuery Experience Required: 5 - 8 Years Job Location: Greater Noida only Send your CV to Gaurav.2.Kumar@coforge.com We at Coforge are hiring an Informatica IDMC Developer with the following skillset: Key Responsibilities: Design, develop, and maintain robust ETL pipelines using Informatica IDMC (IICS). Collaborate with data architects, analysts, and business stakeholders to gather and understand data requirements. Integrate data from diverse sources including databases, APIs, and flat files. Optimize data workflows for performance, scalability, and reliability. Monitor and troubleshoot ETL jobs and resolve data quality issues. Implement data governance and security best practices. Maintain comprehensive documentation of data flows, transformations, and architecture. Participate in code reviews and contribute to continuous improvement initiatives. Required Skills & Qualifications: Strong hands-on experience with Informatica IDMC (IICS) and cloud-based ETL tools. Proficiency in SQL and experience with relational databases such as Oracle, SQL Server, and PostgreSQL. Experience working with cloud platforms like AWS, Azure, or GCP. Familiarity with data warehousing concepts and tools such as Snowflake, Redshift, or BigQuery. Excellent problem-solving abilities and strong communication skills. Preferred Qualifications: Experience with CI/CD pipelines and version control systems. Knowledge of data modeling and metadata management. Certification in Informatica or cloud platforms is a plus.
Posted 2 weeks ago
4.0 years
0 Lacs
Delhi, India
On-site
What do you need to know about us? M+C Saatchi Performance is an award-winning global digital media agency, connecting brands to people. We deliver business growth for our clients through effective, measurable, and evolving digital media strategies. Position Title: Analyst - Reporting & QA Department: Reporting & QA Location: New Delhi / Hybrid About the Role: We are looking for a highly skilled Analyst - Reporting & QA with a deep understanding of digital and mobile media to join our Reporting and QA team. This role will focus on enabling our clients to meet their media goals by ensuring data accuracy and delivering actionable insights into media performance through our reporting tools. The ideal candidate will have strong technical skills, be detail-oriented, and have experience in digital/mobile media attribution and reporting. Core Responsibilities: ETL & Data Automation: Use Matillion to streamline data processes, ensuring efficient and reliable data integration across all reporting systems. Data Quality Assurance: Verify and validate data accuracy within Power BI dashboards, proactively identifying and addressing discrepancies to maintain high data integrity. Dashboard Development: Build, maintain, and optimize Power BI dashboards to deliver real-time insights that help clients understand the performance of their digital and mobile media campaigns. Media Performance Insights: Collaborate closely with media teams to interpret data, uncover trends, and provide actionable insights that support clients in optimizing their media investments. Industry Expertise: Apply in-depth knowledge of digital and mobile media, attribution models, and reporting frameworks to deliver valuable perspectives on media performance. Tools & Platforms Expertise: Utilize tools such as GA4, platform reporting systems, first-party data analytics, and mobile measurement partners (MMPs) to support comprehensive media insights for clients.
Qualifications and Experience: Education: Bachelor’s degree in Statistics, Data Science, Computer Science, Marketing, or a related field. Experience: 4-6 years in a similar role, with substantial exposure to data analysis, reporting, and the digital/mobile media landscape. Technical Skills: Proficiency in ETL tools (preferably Matillion), Power BI, and data quality control. Industry Knowledge: Strong understanding of digital and mobile media, with familiarity in attribution, reporting practices, and performance metrics. Analytical Skills: Skilled in interpreting complex data, generating actionable insights, and presenting findings effectively to non-technical stakeholders. Communication: Excellent communicator with a proven ability to collaborate effectively across cross-functional teams and with clients. Tools & Platforms: Proficiency in GA4, platform reporting, first-party data analysis, and mobile measurement partners (MMPs). Desired Skills: Background in a media agency environment. Experience with cloud-based data platforms (e.g., AWS, Redshift) preferred. Experience with Power BI is a must. Strong collaboration skills and the ability to work independently. What can you look forward to? Being a part of the world’s largest independent advertising holding group. Family Health Insurance Coverage. Flexible Working Hours. Regular events including Reece Lunch & indoor games. Employee Training/Learning Programs. About M+C Saatchi Performance M+C Saatchi Performance has pledged its commitment to create a company that values difference, with an inclusive culture. As part of this, M+C Saatchi Performance continues to be an Equal Opportunity Employer which does not and shall not discriminate, celebrates diversity, and bases all hiring and promotion decisions solely on merit, without regard for any personal characteristics. All employee information is kept confidential according to the General Data Protection Regulation (GDPR).
Posted 2 weeks ago
4.0 years
0 Lacs
Delhi, India
On-site
What do you need to know about us? M+C Saatchi Performance is an award-winning global digital media agency, connecting brands to people. We deliver business growth for our clients through effective, measurable, and evolving digital media strategies. Position Title: Analyst - Reporting & QA Department: Reporting & QA Location: New Delhi - Hybrid About the Role: We are looking for a highly skilled Analyst - Reporting & QA with a deep understanding of digital and mobile media to join our Reporting and QA team. This role will focus on enabling our clients to meet their media goals by ensuring data accuracy and delivering actionable insights into media performance through our reporting tools. The ideal candidate will have strong technical skills, be detail-oriented, and have experience in digital/mobile media attribution and reporting. Core Responsibilities: ETL & Data Automation: Use Matillion to streamline data processes, ensuring efficient and reliable data integration across all reporting systems. Data Quality Assurance: Verify and validate data accuracy within Power BI dashboards, proactively identifying and addressing discrepancies to maintain high data integrity. Dashboard Development: Build, maintain, and optimize Power BI dashboards to deliver real-time insights that help clients understand the performance of their digital and mobile media campaigns. Media Performance Insights: Collaborate closely with media teams to interpret data, uncover trends, and provide actionable insights that support clients in optimizing their media investments. Industry Expertise: Apply in-depth knowledge of digital and mobile media, attribution models, and reporting frameworks to deliver valuable perspectives on media performance. Tools & Platforms Expertise: Utilize tools such as GA4, platform reporting systems, first-party data analytics, and mobile measurement partners (MMPs) to support comprehensive media insights for clients.
Qualifications and Experience: Education: Bachelor’s degree in Statistics, Data Science, Computer Science, Marketing, or a related field. Experience: 4-6 years in a similar role, with substantial exposure to data analysis, reporting, and the digital/mobile media landscape. Technical Skills: Proficiency in ETL tools (preferably Matillion), Power BI, and data quality control. Industry Knowledge: Strong understanding of digital and mobile media, with familiarity in attribution, reporting practices, and performance metrics. Analytical Skills: Skilled in interpreting complex data, generating actionable insights, and presenting findings effectively to non-technical stakeholders. Communication: Excellent communicator with a proven ability to collaborate effectively across cross-functional teams and with clients. Tools & Platforms: Proficiency in GA4, platform reporting, first-party data analysis, and mobile measurement partners (MMPs). Desired Skills: Background in a media agency environment. Experience with cloud-based data platforms (e.g., AWS, Redshift) preferred. Experience with Power BI is a must. Strong collaboration skills and the ability to work independently. What can you look forward to? Being a part of the world’s largest independent advertising holding group. Family Health Insurance Coverage. Flexible Working Hours. Regular events including Reece Lunch & indoor games. Employee Training/Learning Programs. About M+C Saatchi Performance M+C Saatchi Performance has pledged its commitment to create a company that values difference, with an inclusive culture. As part of this, M+C Saatchi Performance continues to be an Equal Opportunity Employer which does not and shall not discriminate, celebrates diversity, and bases all hiring and promotion decisions solely on merit, without regard for any personal characteristics. All employee information is kept confidential according to the General Data Protection Regulation (GDPR).
Posted 2 weeks ago
6.0 - 11.0 years
12 - 22 Lacs
Bengaluru
Work from Office
Our engineering team is looking for a Data Engineer who is highly proficient in Python, has a very good understanding of AWS cloud computing and ETL pipelines, and demonstrates proficiency with SQL and relational database concepts. In this role you will be a mid-to-senior-level individual contributor guiding our migration efforts by serving as a senior data engineer, working closely with the data architects to evaluate best-fit solutions and processes for our team. You will work with the rest of the team as we move away from legacy tech and introduce new tools and ETL pipeline solutions. You will collaborate with subject matter experts, data architects, informaticists, and data scientists to evolve our current cloud-based ETL to the next generation. Responsibilities Independently prototypes/develops data solutions of high complexity to meet the needs of the organization and business customers. Designs proof-of-concept solutions utilizing an advanced understanding of multiple coding languages to meet technical and business requirements, with an ability to perform iterative solution testing to ensure specifications are met. Designs and develops data solutions that enable effective self-service data consumption, and can describe their value to the customer. Collaborates with stakeholders in defining metrics that are impactful to the business. Prioritizes efforts based on customer value. Has an in-depth understanding of Agile techniques. Can set expectations for deliverables of high complexity. Can assist in the creation of roadmaps for data solutions. Can turn vague ideas or problems into data product solutions. Influences strategic thinking across the team and the broader organization. Maintains proof-of-concept and prototype data solutions, and manages any assessment of their viability and scalability, with own team or in partnership with IT. Working with IT, assists in building robust systems focusing on long-term and ongoing maintenance and support.
Ensures data solutions include deliverables required to achieve high-quality data. Displays a strong understanding of complex multi-tier, multi-platform systems, and applies principles of metadata, lineage, business definitions, compliance, and data security to project work. Has an in-depth understanding of Business Intelligence tools, including visualization and user experience techniques. Can set expectations for deliverables of high complexity. Works with IT to help scale prototypes. Demonstrates a comprehensive understanding of new technologies as needed to progress initiatives. Requirements Expertise in Python programming, with demonstrated real-world experience building out data tools in a Python environment. Expertise in AWS services, with demonstrated real-world experience building out data tools on AWS. Bachelor's degree in Computer Science, Computer Engineering, or a related discipline preferred. Master's in the same or related disciplines strongly preferred. 3+ years of experience coding for data management, data warehousing, or other data environments, including, but not limited to, Python and Spark. Experience with SAS is preferred. 3+ years of experience as a developer working in an AWS cloud computing environment. 3+ years of experience using Git or Bitbucket. Experience with Redshift, RDS, and DynamoDB is preferred. Strong written and oral communication skills are required. Experience in the healthcare industry with healthcare data analytics products. Experience with healthcare vocabulary and data standards (OMOP, FHIR) is a plus.
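One small, concrete piece of the data-lake work this role touches is laying out objects with Hive-style date partitions so crawlers and query engines can prune by date. The sketch below only builds the object key; the prefix and filename are invented, and a real job would write the objects via boto3 or Spark rather than returning strings.

```python
from datetime import date

def partition_key(prefix: str, event_date: date, filename: str) -> str:
    """Build a Hive-style partitioned object key (year=/month=/day=),
    the layout that AWS Glue crawlers and SQL engines commonly expect."""
    return (f"{prefix}/year={event_date.year}"
            f"/month={event_date.month:02d}"
            f"/day={event_date.day:02d}/{filename}")

key = partition_key("raw/orders", date(2025, 7, 9), "batch-001.parquet")
print(key)  # raw/orders/year=2025/month=07/day=09/batch-001.parquet
```

Zero-padding the month and day keeps keys lexicographically sortable, which makes range listings over date partitions cheap.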
Posted 2 weeks ago
0 years
0 Lacs
Hyderabad, Telangana, India
Remote
We are looking for a Test Engineer who will become part of our team building and testing the Creditsafe data. You will be working closely with the database teams and data engineering to build specific systems facilitating the extraction and transformation of Creditsafe data. Based on the test strategy and approach, you will develop, enhance, and execute tests that add value to Creditsafe data. You will act as a primary source of guidance to Junior Test Engineers and Test Engineers in all areas of data quality. You will contribute to the team using data quality best practices and techniques. You can confidently communicate test results with your team members and stakeholders using evidence and reports. You act as a mentor and coach to the less experienced members of the test team. You will promote and coach leading practices in data test management, design, and implementation. You will be part of an Agile team and will effectively contribute to the ceremonies, acting as the quality specialist within that team. You are an influencer and will provide leadership in defining and implementing agreed standards, and will actively promote this within your team and the wider development community. The ideal candidate has extensive experience in mentorship and leading by example and is able to communicate values consistent with the Creditsafe philosophy of engagement. You have critical thinking skills and can diplomatically communicate within and outside your areas of responsibility, challenging assumptions where required. Required Skills Proven working experience as a data test engineer, business data analyst, or ETL tester.
Technical expertise regarding data models, database design and development, data mining, and segmentation techniques Strong knowledge of and experience with SQL databases Hands-on experience of best engineering practices (handling and logging errors, system monitoring, and building human-fault-tolerant applications) Knowledge of statistics and experience using statistical packages for analysing datasets (Excel, SPSS, SAS, etc.) is an advantage. Comfortable working with relational databases such as Redshift, Oracle, PostgreSQL, MySQL, and MariaDB (PostgreSQL preferred) Strong analytical skills with the ability to collect, organise, analyse, and disseminate significant amounts of information with attention to detail and accuracy Adept at queries, report writing, and presenting findings. A BS in Mathematics, Economics, Computer Science, Information Management, or Statistics is desirable but not essential A good understanding of cloud technology, preferably AWS and/or Azure DevOps A practical understanding of programming: JavaScript, Python Excellent communication skills Practical experience of testing in an Agile approach Desirable Skills An understanding of version control systems Practical experience of conducting code reviews Practical experience of pair testing and pair programming Primary Responsibilities Reports to Engineering Lead Work as part of the engineering team in data acquisition Designing and implementing processes and tools to monitor and improve the quality of Creditsafe's data. Developing and executing test plans to verify the accuracy and reliability of data. Working with data analysts and other stakeholders to establish and maintain data governance policies. Identifying and resolving issues with the data, such as errors, inconsistencies, or duplication. Collaborating with other teams, such as data analysts and data scientists, to ensure the quality of data used for various projects and initiatives.
Providing training and guidance to other team members on data quality best practices and techniques. Monitoring and reporting on key data quality metrics, such as data completeness and accuracy. Continuously improving data quality processes and tools based on feedback and analysis. Work closely with their Agile team to promote a whole-team approach to quality Documents approaches and processes that improve the quality effort for use by team members and the wider test function Strong practical knowledge of software testing techniques and the ability to advise on, and select, the correct technique dependent on the problem at hand Conducts analysis of the team's test approach, taking a proactive role in the formulation of the relevant quality criteria in line with the team goals Work with team members to define standards and processes applicable to their area of responsibility Monitor progress of team deliverables, injecting quality concerns in a timely, effective manner Gain a sufficient understanding of the system architecture to inform their test approach and that of the test engineers Creation and maintenance of concise and accurate defect reports in line with the established defect process Behavioural skills: Teamwork – Leads by example in the areas of cooperation, collaboration, and partnerships Quality Improvement – Takes the initiative to deliver improvements and results of value Problem Solving – Identifies and prioritises problems and works to deliver workable solutions Seeks feedback from team members and provides feedback to team members. Has an appreciation of others' viewpoints, frequently soliciting differing opinions to their own Promotes an inclusive, merit-based approach to differing opinions Autonomy: Is able to work independently within the constraints of their Agile team. Is able to determine when issues should be escalated. Takes responsibility and provides rationale for own decisions.
Influence: Interacts with and influences colleagues in a positive manner. Undertakes supervisory activities. Makes decisions which impact and optimise the work assigned to individuals or projects. Aspires to be regarded as the SME for quality-related issues Complexity: Is able to grasp complex concepts, form an understanding, and explain them to other team members. Is able to articulate complex concepts to stakeholders in a non-technical manner Performs a range of work, sometimes complex and non-routine, in a variety of environments. Applies a methodical approach to issue definition and resolution. Business skills: Demonstrates an analytical and systematic approach to issue resolution, acting as the primary contact within their team. Takes the initiative in identifying and negotiating appropriate personal development opportunities with less experienced test team members Demonstrates effective communication skills and can vary message presentation dependent on the level of stakeholder Plans, schedules, and monitors own work (and that of others) competently within limited deadlines and according to relevant legislation, standards, and procedures. Appreciates the wider business context, and how their own role relates to other roles and to the business objectives of Creditsafe. Company Benefits: Competitive Salary Work from Home Pension Medical Insurance Cab facility for Women Dedicated Gaming Area
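The data-quality monitoring this role describes, such as completeness and duplicate detection, can be sketched as two small checks in plain Python; the record shape below is invented for illustration, and in practice assertions like these would run against the warehouse (Redshift/PostgreSQL) via a test framework.

```python
def completeness(rows, field):
    """Share of rows with a non-empty value in `field` (1.0 = fully complete)."""
    filled = sum(1 for r in rows if r.get(field) not in (None, ""))
    return filled / len(rows)

def duplicate_keys(rows, key):
    """Keys that appear more than once, a common data-quality defect."""
    seen, dupes = set(), set()
    for r in rows:
        k = r[key]
        (dupes if k in seen else seen).add(k)
    return dupes

# Invented sample: two populated names out of three, one duplicated key.
records = [
    {"company_id": "C1", "name": "Acme"},
    {"company_id": "C2", "name": ""},
    {"company_id": "C1", "name": "Acme Ltd"},
]
print(completeness(records, "name"))          # 2 of 3 rows populated
print(duplicate_keys(records, "company_id"))  # {'C1'}
```

Metrics like these, computed on every load and compared against thresholds, are what "monitoring and reporting on key data quality metrics" usually reduces to.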
Posted 2 weeks ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
Remote
Job Title: Database Engineer X 8 Positions
Location: Hyderabad, India
Salary: Market Rate/Negotiable

About us
Creditsafe is the most used business data provider in the world, reducing risk and maximizing opportunities for our 110,000 business customers. Our journey began in Oslo, Norway in 1997, where we had a dream of using the then-revolutionary internet to deliver instant access company credit reports to small and medium-sized businesses. Creditsafe realized this dream and changed the market for the better for businesses of all sizes. From there, we opened 15 more offices throughout Europe, the USA and Asia. We provide data on more than 300 million companies and provide customer notifications for billions of changes annually. We are a high-growth company offering the freedom and flexibility of a start-up type culture due to the continuous innovation and new product development performed, coupled with the stability of being a profitable and growing company! With such a large customer base and breadth of data and analytics technology, you will have real opportunities to help companies survive and thrive in challenging times by reducing business risk and choosing trustworthy customers and suppliers.

Summary:
This is your opportunity to develop your career with an exciting, fast-paced and rapidly expanding business, one of the leading providers of Business Intelligence worldwide. As a Database Engineer with excellent database development skills, you will be responsible for developing and maintaining the databases and scripts that power the company's products and websites, handling large data sets and serving more than 20 million hits per day.
You will work with your team to deliver work on time, in line with the business requirements, and to a high level of quality.

Primary Responsibilities:
· 5+ years' solid commercial experience of Oracle development under a 10g or 11g environment.
· Advanced PL/SQL knowledge required.
· ETL skills – Pentaho would be beneficial.
· Any wider DB experience would be desirable, e.g. Redshift, Aurora DB, DynamoDB, MariaDB, MongoDB etc.
· Cloud/AWS experience and an interest in learning new technologies.
· Experience in tuning Oracle queries in large databases.
· Good experience in loading and extracting large data sets.
· Experience of working with an Oracle database under a bespoke web development environment.
· Analytical and critical thinking skills; agile problem-solving abilities.
· Detail oriented, self-motivated, able to work independently with little or no supervision, and committed to the highest standards of quality for the entire release process.
· Excellent written and verbal communication skills.
· Attention to detail.
· Ability to work in a fast-paced/changing environment.
· Ability to thrive in a deadline-driven, stressful project environment.
· 3+ years of software development experience.

Qualifications and Experience
· Degree in Computer Science or similar.
· Experience with loading data through SSIS.
· Experience working on financial and business intelligence projects or in big data environments.
· A desire to learn new skills and branch into development using a wide range of alternative technologies.

Skills, Knowledge and Abilities
· Write code for new development requirements as well as provide bug fixing, support and maintenance of existing code.
· Test your code to ensure it functions as per the business requirements, considering the impact of your code on other areas of the solution.
· Provide expert advice on performance tuning within Oracle.
· Perform large-scale imports and extracts of data.
· Assist the business in the collection and documentation of users' requirements where needed; provide estimates and work plans.
· Create and maintain technical documentation.
· Follow all company procedures/standards/processes.
· Contribute to architectural design and development, making technically sound development recommendations.
· Provide support to other staff in the department and act as a mentor to less experienced staff, including through code reviews.
· Work as a team player in an agile environment.
· Build release scripts and plans to facilitate the deployment of your code to testing and production environments.
· Take ownership of any issues that occur within your area to ensure an appropriate solution is found.
· Assess opportunities for application and process improvement and share with team members and/or affected parties.

Company Benefits: Competitive Salary, Work from Home, Pension, Medical Insurance, Cab facility for Women, Dedicated Gaming Area
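The "large-scale imports and extracts" this posting asks for typically rely on batched operations rather than row-at-a-time processing. A minimal, database-agnostic sketch using Python's built-in sqlite3 module — the posting's Oracle environment would use the same pattern through its own driver, and the table and column names here are invented for illustration:

```python
import sqlite3

def load_in_batches(conn, rows, batch_size=1000):
    """Insert rows in fixed-size batches to bound memory use and round-trips."""
    cur = conn.cursor()
    batch = []
    for row in rows:
        batch.append(row)
        if len(batch) >= batch_size:
            cur.executemany("INSERT INTO companies (id, name) VALUES (?, ?)", batch)
            batch = []
    if batch:  # flush the final partial batch
        cur.executemany("INSERT INTO companies (id, name) VALUES (?, ?)", batch)
    conn.commit()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE companies (id INTEGER, name TEXT)")
# A generator keeps the full dataset out of memory during the load.
load_in_batches(conn, ((i, f"company-{i}") for i in range(2500)), batch_size=1000)
count = conn.execute("SELECT COUNT(*) FROM companies").fetchone()[0]
# count == 2500
```

The same shape — accumulate, flush on a size boundary, flush the remainder, commit once — carries over to Oracle bulk binds or any other bulk-load API.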
Posted 2 weeks ago
5.0 - 8.0 years
0 Lacs
India
On-site
Bloomreach is building the world's premier agentic platform for personalization. We're revolutionizing how businesses connect with their customers, building and deploying AI agents to personalize the entire customer journey. We're taking autonomous search mainstream, making product discovery more intuitive and conversational for customers, and more profitable for businesses. We're making conversational shopping a reality, connecting every shopper with tailored guidance and product expertise — available on demand, at every touchpoint in their journey. We're designing the future of autonomous marketing, taking the work out of workflows, and reclaiming the creative, strategic, and customer-first work marketers were always meant to do. And we're building all of that on the intelligence of a single AI engine — Loomi AI — so that personalization isn't only autonomous… it's also consistent. From retail to financial services, hospitality to gaming, businesses use Bloomreach to drive higher growth and lasting loyalty. We power personalization for more than 1,400 global brands, including American Eagle, Sonepar, and Pandora.

We are seeking a Senior AI Engineer to join our dynamic team. In this role, you will be instrumental in building data-driven ML/AI algorithms that enhance our search and recommendation systems. Your primary focus will be on data engineering, analysis, transformations, model training, and serving, ensuring practical and scalable applications of machine learning within our products. This position emphasizes productization and the implementation of ML/AI solutions over pure data science and research, making it ideal for professionals thriving in the fast-paced generative AI era.

Key Responsibilities

Data Engineering & Analysis
Slice and dice analytics data to formulate hypotheses and generate ideas to improve search and recommendation performance. Perform comprehensive data transformations to prepare datasets for model training and evaluation.
Build and maintain data pipelines using tools like Airflow, Kubeflow, and MLflow to support ML/AI workflows.

Model Development & Deployment
Design, develop, and enhance machine learning and AI models tailored to product discovery and search functionalities. Conduct feature engineering to extract meaningful insights from historical data, search queries, product catalogs, and images. Collaborate with Data Engineers to integrate and scale ML components to production-level systems capable of handling large-scale data. Ensure seamless deployment of models, maintaining high availability and performance in cloud environments.

Algorithm Implementation & Optimization
Dive deep into algorithm applicability, performing impact analysis to ensure models meet performance and business objectives. Optimize and build new algorithms to address various challenges in product discovery and search.

Productization of ML/AI Solutions
Translate data-driven insights and models into actionable product features that enhance user experience. Work closely with Data Science, Product and Engineering teams to implement practical ML/AI applications that drive business outcomes.

Continuous Learning & Improvement
Stay abreast of the latest advancements in ML/AI, particularly in generative AI and large language models (LLMs). Continuously refine and improve existing models and workflows based on new research and industry trends.

Qualifications

Educational Background
BS/MS degree in Computer Science, Engineering, Mathematics, or a related discipline with a strong mathematical foundation.

Experience
5-8 years of experience building ML-driven, fast, and scalable ML/AI algorithms in a corporate or startup environment.

Technical Skills
Proficient in Python with excellent programming skills. Strong understanding of machine learning and natural language processing technologies, including classification, information retrieval, clustering, knowledge graphs, semi-supervised learning, and ranking.
Experience with deep learning frameworks such as PyTorch, Keras, or TensorFlow. Proficient in SQL and experience with data warehouses like Redshift or BigQuery. Experience with big data technologies such as Hadoop, Spark, Kafka, and data lakes for large-scale processing. Strong understanding of data structures, algorithms, and system design for building highly available, high-performance systems. Experience with workflow orchestration and ML pipeline tools such as Airflow, Kubeflow, and MLflow.

Specialized Knowledge
Strong awareness of recent trends in Generative AI and Large Language Models (LLMs). Experience working with the GenAI stack is highly desirable.

Soft Skills
Excellent problem-solving and analytical skills with the ability to adapt to new ML technologies. Effective communication skills in English, both verbal and written. Ability to work collaboratively in a fast-paced, agile environment.

More things you'll like about Bloomreach:

Culture:
A great deal of freedom and trust. At Bloomreach we don't clock in and out, and we have neither corporate rules nor long approval processes. This freedom goes hand in hand with responsibility. We are interested in results from day one. We have defined our 5 values and the 10 underlying key behaviors that we strongly believe in. We can only succeed if everyone lives these behaviors day to day. We've embedded them in our processes like recruitment, onboarding, feedback, personal development, performance review and internal communication. We believe in flexible working hours to accommodate your working style. We work virtual-first with several Bloomreach Hubs available across three continents. We organize company events to experience the global spirit of the company and get excited about what's ahead. We encourage and support our employees to engage in volunteering activities - every Bloomreacher can take 5 paid days off to volunteer*. The Bloomreach Glassdoor page elaborates on our stellar 4.4/5 rating.
The Bloomreach Comparably page shows an even higher Culture score of 4.9/5.

Personal Development:
We have a People Development Program -- participating in personal development workshops on various topics run by experts from inside the company. We are continuously developing & updating competency maps for select functions. Our resident communication coach Ivo Večeřa is available to help navigate work-related communications & decision-making challenges.* Our managers are strongly encouraged to participate in the Leader Development Program to develop in the areas we consider essential for any leader. The program includes regular comprehensive feedback, consultations with a coach and follow-up check-ins. Bloomreachers utilize the $1,500 professional education budget on an annual basis to purchase education products (books, courses, certifications, etc.)*

Well-being:
The Employee Assistance Program -- with counselors -- is available for non-work-related challenges.* Subscription to Calm - sleep and meditation app.* We organize 'DisConnect' days where Bloomreachers globally enjoy one additional day off each quarter, allowing us to unwind together and focus on activities away from the screen with our loved ones. We facilitate sports, yoga, and meditation opportunities for each other. Extended parental leave of up to 26 calendar weeks for Primary Caregivers.*

Compensation:
Restricted Stock Units or Stock Options are granted depending on a team member's role, seniority, and location.* Everyone gets to participate in the company's success through the company performance bonus.* We offer an employee referral bonus of up to $3,000, paid out immediately after the new hire starts. We reward & celebrate work anniversaries -- Bloomversaries!*

(*Subject to employment type. Interns are exempt from marked benefits, usually for the first 6 months.)

Excited? Join us and transform the future of commerce experiences!
If this position doesn't suit you, but you know someone who might be a great fit, share it - we will be very grateful! Any unsolicited resumes/candidate profiles submitted through our website or to personal email accounts of employees of Bloomreach are considered property of Bloomreach and are not subject to payment of agency fees.
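The ranking and feature-engineering work this posting describes can be illustrated with a toy lexical scorer. This is a deliberately simplified sketch — the scoring function and product titles are invented for the example and say nothing about how Bloomreach's actual systems rank results:

```python
from collections import Counter

def score(query, title):
    """Toy relevance score: fraction of query tokens found in the title."""
    q = Counter(query.lower().split())
    t = Counter(title.lower().split())
    overlap = sum((q & t).values())  # multiset intersection of tokens
    return overlap / (len(query.split()) or 1)

def rank(query, titles):
    """Return titles sorted by descending toy relevance."""
    return sorted(titles, key=lambda t: score(query, t), reverse=True)

products = ["red running shoes", "blue denim jacket", "running socks"]
ranked = rank("running shoes", products)
# "red running shoes" matches both query tokens and ranks first
```

Production search replaces this token overlap with learned features (embeddings, click signals, catalog attributes), but the rank-by-score skeleton is the same.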
Posted 2 weeks ago
5.0 years
0 Lacs
India
On-site
We're on the lookout for a skilled and motivated Data Engineer to join our growing tech team. If you're passionate about building robust data pipelines, optimizing data workflows, and enabling smart data-driven decisions — we'd love to connect with you!

Key Responsibilities:
Design, build, and maintain scalable ETL/ELT pipelines
Integrate data from multiple sources into centralized data stores
Work closely with Data Analysts and Scientists to support analytical needs
Optimize data delivery for performance and reliability
Ensure data integrity, quality, and compliance

Preferred Skills & Experience:
2–5 years of experience in Data Engineering
Strong knowledge of SQL, Python, Spark/PySpark
Experience with data warehousing (e.g., Snowflake, Redshift, BigQuery)
Hands-on with ETL tools, data pipelines, and APIs
Familiarity with cloud platforms (Azure, AWS, or GCP)
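The ETL/ELT responsibilities above follow a common extract → transform → load shape. A minimal, self-contained sketch of that shape in plain Python — the source data, field names, and table are invented for illustration (a real pipeline would read from the warehouses and tools the posting lists):

```python
import csv, io, sqlite3

RAW = "id,amount\n1,10.5\n2,\n3,4.0\n"  # pretend CSV feed from a source system

def extract(text):
    """Extract: parse the raw feed into dict rows."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: drop rows with missing amounts and normalise types."""
    return [(int(r["id"]), float(r["amount"])) for r in rows if r["amount"]]

def load(conn, rows):
    """Load: write the cleaned rows into the target store."""
    conn.execute("CREATE TABLE IF NOT EXISTS payments (id INTEGER, amount REAL)")
    conn.executemany("INSERT INTO payments VALUES (?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(conn, transform(extract(RAW)))
total = conn.execute("SELECT SUM(amount) FROM payments").fetchone()[0]
# row 2 has no amount and is dropped; total == 14.5
```

Keeping the three stages as separate functions is what makes a pipeline testable — each stage can be exercised on its own before the pieces are wired into an orchestrator.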
Posted 2 weeks ago
5.0 years
20 Lacs
Chandigarh
On-site
About the Role
We are seeking a highly experienced and hands-on Fullstack Architect to lead the design and architecture of scalable, enterprise-grade software solutions. This role requires a deep understanding of both frontend and backend technologies, cloud infrastructure, and microservices, with the ability to guide teams through technical challenges and solution delivery.

Key Responsibilities
Architect, design, and oversee the development of full-stack applications using modern JS frameworks and cloud-native tools.
Lead microservice architecture design, ensuring system scalability, reliability, and performance.
Evaluate and implement AWS services (Lambda, ECS, Glue, Aurora, API Gateway, etc.) for backend solutions.
Provide technical leadership to engineering teams across all layers (frontend, backend, database).
Guide and review code, perform performance optimization, and define coding standards.
Collaborate with DevOps and Data teams to integrate services (Redshift, OpenSearch, Batch).
Translate business needs into technical solutions and communicate with cross-functional stakeholders.

Required Skills
Deep expertise in Node.js, TypeScript, React.js, Python, Redux, and Jest.
Proven experience designing and deploying systems using Microservices architecture.
Strong understanding of AWS services: API Gateway, ECS, Lambda, Aurora, Glue, SQS, OpenSearch, Batch.
Hands-on with MySQL, Redshift, and writing optimized queries.
Advanced knowledge of HTML, CSS, Bootstrap, JavaScript.
Familiarity with tools: VS Code, DataGrip, Jira, GitHub, Postman.
Strong knowledge of architectural design patterns and security best practices.
Job Types: Full-time, Permanent
Pay: From ₹2,055,277.41 per year
Benefits: Health insurance, Leave encashment, Paid sick time, Paid time off, Provident Fund
Schedule: Day shift, Monday to Friday
Education: Bachelor's (Preferred)
Experience: Full-stack development: 5 years (Required)
Location: Chandigarh, Chandigarh (Required)
Shift availability: Day Shift (Required)
Work Location: In person
Posted 2 weeks ago
7.0 - 12.0 years
5 - 7 Lacs
Hyderābād
Remote
Job Information
Date Opened: 07/08/2025
Job Type: Full time
Industry: IT Services
City: Hyderabad
State/Province: Telangana
Country: India
Zip/Postal Code: 500059

About DATAECONOMY:
We are a fast-growing data & analytics company headquartered in Dublin, with offices in Dublin, OH, and Providence, RI, and an advanced technology center in Hyderabad, India. We are clearly differentiated in the data & analytics space via our suite of solutions, accelerators, frameworks, and thought leadership.

Job Description
We are seeking a highly experienced and hands-on Lead/Senior Data Engineer to architect, develop, and optimize data solutions in a cloud-native environment. The ideal candidate will have 7–12 years of strong technical expertise in AWS Glue, PySpark, and Python, along with experience designing robust data pipelines and frameworks for large-scale enterprise systems. Prior exposure to the financial domain or regulated environments is a strong advantage.

Key Responsibilities:
Solution Architecture: Design scalable and secure data pipelines using AWS Glue, PySpark, and related AWS services (EMR, S3, Lambda, etc.).
Leadership & Mentorship: Guide junior engineers, conduct code reviews, and enforce best practices in development and deployment.
ETL Development: Lead the design and implementation of end-to-end ETL processes for structured and semi-structured data.
Framework Building: Develop and evolve data frameworks, reusable components, and automation tools to improve engineering productivity.
Performance Optimization: Optimize large-scale data workflows for performance, cost, and reliability.
Data Governance: Implement data quality, lineage, and governance strategies in compliance with enterprise standards.
Collaboration: Work closely with product, analytics, compliance, and DevOps teams to deliver high-quality solutions aligned with business goals.
CI/CD Automation: Set up and manage continuous integration and deployment pipelines using AWS CodePipeline, Jenkins, or GitLab.
Documentation & Presentations: Prepare technical documentation and present architectural solutions to stakeholders across levels.

Requirements

Required Qualifications:
7–12 years of experience in data engineering or related fields.
Strong expertise in Python programming with a focus on data processing.
Extensive experience with AWS Glue (both Glue Jobs and Glue Studio/Notebooks).
Deep hands-on experience with PySpark for distributed data processing.
Solid AWS knowledge: EMR, S3, Lambda, IAM, Athena, CloudWatch, Redshift, etc.
Proven experience in architecting and managing complex ETL workflows.
Proficiency with Apache Airflow or similar orchestration tools.
Hands-on experience with CI/CD pipelines and DevOps best practices.
Familiarity with data quality, data lineage, and metadata management.
Strong experience working in agile/scrum teams.
Excellent communication and stakeholder engagement skills.

Preferred/Good to Have:
Experience in financial services, capital markets, or compliance systems.
Knowledge of data modeling, data lakes, and data warehouse architecture.
Familiarity with SQL (Athena/Presto/Redshift Spectrum).
Exposure to ML pipeline integration or event-driven architecture is a plus.

Benefits
Flexible work culture and remote options
Opportunity to lead cutting-edge cloud data engineering projects
Skill-building in large-scale, regulated environments
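The "framework building" responsibility — reusable pipeline components that improve engineering productivity — can be sketched as simple step composition. This is a toy illustration in plain Python; a real implementation would sit on Glue/PySpark (not modelled here), and the dedupe and PII-masking steps are invented examples of reusable components:

```python
from functools import reduce

def pipeline(*steps):
    """Compose transformation steps into a single callable, applied in order."""
    return lambda data: reduce(lambda acc, step: step(acc), steps, data)

def dedupe(rows):
    """Reusable component: keep the first row seen for each id."""
    seen, out = set(), []
    for r in rows:
        if r["id"] not in seen:
            seen.add(r["id"])
            out.append(r)
    return out

def mask_pii(rows):
    """Reusable component: regulated environments often mask identifiers downstream."""
    return [{**r, "email": "***"} for r in rows]

etl = pipeline(dedupe, mask_pii)
result = etl([{"id": 1, "email": "a@x.com"}, {"id": 1, "email": "a@x.com"}])
# one row survives dedupe, with the email masked
```

Because each step is an ordinary function over rows, teams can share a library of steps and assemble job-specific pipelines from them — the productivity gain the posting is after.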
Posted 2 weeks ago
0 years
6 - 8 Lacs
Hyderābād
On-site
Ready to shape the future of work? At Genpact, we don't just adapt to change—we drive it. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's AI Gigafactory, our industry-first accelerator, is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, this is your moment.

Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.

Inviting applications for the role of Consultant – AWS!

Responsibilities
Design and deploy scalable, highly available, and fault-tolerant AWS data processes using AWS data services (Glue, Lambda, Step Functions, Redshift)
Monitor and optimize the performance of cloud resources to ensure efficient utilization and cost-effectiveness.
Implement and maintain security measures to protect data and systems within the AWS environment, including IAM policies, security groups, and encryption mechanisms.
Migrate application data from legacy databases to cloud-based solutions (Redshift, DynamoDB, etc.) for high availability at low cost
Develop application programs using Big Data technologies like Apache Hadoop and Apache Spark, with appropriate cloud-based services such as Amazon AWS.
Build data pipelines by developing ETL (Extract-Transform-Load) processes
Implement backup, disaster recovery, and business continuity strategies for cloud-based applications and data.
Analyse business and functional requirements, which involves a review of existing system configurations and operating methodologies as well as understanding evolving business needs
Analyse requirements/user stories in business meetings, strategize the impact of requirements on different platforms/applications, and convert the business requirements into technical requirements
Participate in design reviews to provide input on functional requirements, product designs, schedules and/or potential problems
Understand the current application infrastructure and suggest cloud-based solutions which reduce operational cost and require minimal maintenance while providing high availability with improved security
Perform unit testing on the modified software to ensure that the new functionality works as expected while existing functionalities continue to work in the same way
Coordinate with release management and other supporting teams to deploy changes in the production environment

Qualifications we seek in you!
Minimum Qualifications
Experience in designing and implementing data pipelines, building data applications, and data migration on AWS
Strong experience implementing data lakes using AWS services like Glue, Lambda, Step Functions, Redshift
Experience with Databricks will be an added advantage
Strong experience in Python and SQL
Strong understanding of security principles and best practices for cloud-based environments.
Experience with monitoring tools and implementing proactive measures to ensure system availability and performance.
Excellent problem-solving skills and ability to troubleshoot complex issues in a distributed, cloud-based environment.
Strong communication and collaboration skills to work effectively with cross-functional teams.
Preferred Qualifications/Skills
Master's degree in Computer Science, Electronics, or Electrical Engineering.
AWS Data Engineering & Cloud certifications, Databricks certifications
Experience of working with Oracle ERP
Experience with multiple data integration technologies and cloud platforms
Knowledge of Change & Incident Management processes

Why join Genpact?
Be a transformation leader – Work at the cutting edge of AI, automation, and digital innovation
Make an impact – Drive change for global enterprises and solve business challenges that matter
Accelerate your career – Get hands-on experience, mentorship, and continuous learning opportunities
Work with the best – Join 140,000+ bold thinkers and problem-solvers who push boundaries every day
Thrive in a values-driven culture – Our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress

Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: Up. Let's build tomorrow together.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation.

Please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
Job: Consultant
Primary Location: India-Hyderabad
Schedule: Full-time
Education Level: Master's / Equivalent
Job Posting: Jul 7, 2025, 7:24:46 AM
Unposting Date: Ongoing
Master Skills List: Digital
Job Category: Full Time
Posted 2 weeks ago
5.0 years
6 - 9 Lacs
Hyderābād
On-site
DevSecOps Engineer – CL4

Role Overview:
As a DevSecOps Engineer, you will actively engage in your engineering craft, taking a hands-on approach to multiple high-visibility projects. Your expertise will be pivotal in delivering solutions that delight customers and users, while also driving tangible value for Deloitte's business investments. You will leverage your extensive DevSecOps engineering craftsmanship and advanced proficiency across multiple programming languages, DevSecOps tools, and modern frameworks, consistently demonstrating your strong track record in delivering high-quality, outcome-focused CI/CD and automation solutions. The ideal candidate will be a dependable team player, collaborating with cross-functional teams to design, develop, and deploy advanced software solutions.

Key Responsibilities:

Outcome-Driven Accountability: Embrace and drive a culture of accountability for customer and business outcomes. Develop DevSecOps engineering solutions that solve complex automation problems with valuable outcomes, ensuring high-quality, lean, resilient and secure pipelines with low operating costs, meeting platform/technology KPIs.

Technical Leadership and Advocacy: Serve as the technical advocate for modern DevSecOps practices, ensuring integrity, feasibility, and alignment with business and customer goals, NFRs, and applicable automation/integration/security practices — being responsible for designing and maintaining code repos, CI/CD pipelines, integrations (code quality, QE automation, security, etc.) and environments (sandboxes, dev, test, stage, production) through IaC, both for custom and package solutions, including identifying, assessing, and remediating vulnerabilities.

Engineering Craftsmanship: Maintain accountability for the integrity and design of DevSecOps pipelines and environments while leading the implementation of deployment techniques like Blue-Green and Canary to minimize downtime and enable A/B testing.
Be always hands-on and actively engage with engineers to ensure DevSecOps practices are understood and can be implemented throughout the product development life cycle. Resolve any technical issues from implementation to production operations (e.g., leading triage and troubleshooting of production issues). Be self-driven to learn new technologies, experiment with engineers, and inspire the team to learn and drive application of those new technologies.

Customer-Centric Engineering: Develop lean, yet scalable and flexible, DevSecOps automations through rapid, inexpensive experimentation to solve customer needs, enabling version control, security, logging, feedback loops, continuous delivery, etc. Engage with customers and product teams to deliver the right automation, security, and deployment practices.

Incremental and Iterative Delivery: Adopt a mindset that favors action and evidence over extensive planning. Utilize a leaning-forward approach to navigate complexity and uncertainty, delivering lean, supportable, and maintainable solutions.

Cross-Functional Collaboration and Integration: Work collaboratively with empowered, cross-functional teams including product management, experience, engineering, delivery, infrastructure, and security. Integrate diverse perspectives to make well-informed decisions that balance feasibility, viability, usability, and value. Support a collaborative environment that enhances team synergy and innovation.

Advanced Technical Proficiency: Possess intermediary knowledge in modern software engineering practices and principles, including Agile methodologies, DevSecOps, and Continuous Integration/Continuous Deployment. Strive to be a role model, leveraging these techniques to optimize solutioning and product delivery, ensuring high-quality outcomes with minimal waste.
Demonstrate an intermediate-level understanding of the product development lifecycle, from conceptualization and design to implementation and scaling, with a focus on continuous improvement and learning.

Domain Expertise: Quickly acquire domain-specific knowledge relevant to the business or product. Translate business/user needs into technical requirements and automations. Learn to navigate various enterprise functions such as product, experience, engineering, compliance, and security to drive product value and feasibility.

Effective Communication and Influence: Exhibit exceptional communication skills, capable of articulating technical concepts clearly and compellingly. Support teammates and product teams through well-structured arguments and trade-offs supported by evidence, evaluations, and research. Learn to create a coherent narrative that aligns technical solutions with business objectives.

Engagement and Collaborative Co-Creation: Able to engage and collaborate with product engineering teams, including customers as needed. Able to build and maintain constructive relationships, fostering a culture of co-creation and shared momentum towards achieving product goals. Support diverse perspectives and consensus to create feasible solutions.

The team:
US Deloitte Technology Product Engineering has modernized software and product delivery, creating a scalable, cost-effective model that focuses on value/outcomes by leveraging a progressive and responsive talent structure. As Deloitte's primary internal development team, Product Engineering delivers innovative digital solutions to businesses, service lines, and internal operations with proven bottom-line results and outcomes. It helps power Deloitte's success. It is the engine that drives Deloitte, serving many of the world's largest, most respected companies. We develop and deploy cutting-edge internal and go-to-market solutions that help Deloitte operate effectively and lead in the market.
Our reputation is built on a tradition of delivering with excellence.

Key Qualifications:
§ A bachelor's degree in computer science, software engineering, or a related discipline. An advanced degree (e.g., MS) is preferred but not required. Experience is the most relevant factor.
§ Strong software engineering foundation with a deep understanding of OOP/OOD, functional programming, data structures and algorithms, software design patterns, code instrumentation, etc.
§ 5+ years' proven experience with Python, Bash, PowerShell, JavaScript, C#, and Golang (preferred).
§ 5+ years' proven experience with CI/CD tools (Azure DevOps and GitHub Enterprise) and Git (version control, branching, merging, handling pull requests) to automate build, test, and deployment processes.
§ 5+ years of hands-on experience with security tools automation SAST/DAST (SonarQube, Fortify, Mend), monitoring/logging (Prometheus, Grafana, Dynatrace), and other cloud-native tools on AWS, Azure, and GCP.
§ 5+ years of hands-on experience using Infrastructure as Code (IaC) technologies like Terraform, Puppet, Azure Resource Manager (ARM), AWS CloudFormation, and Google Cloud Deployment Manager.
§ 2+ years of hands-on experience with cloud-native services like Data Lakes, CDN, API Gateways, Managed PaaS, Security, etc. on multiple cloud providers like AWS, Azure and GCP is preferred.
§ Strong understanding of methodologies like XP, Lean, and SAFe to deliver high-quality products rapidly.
§ General understanding of cloud providers' security practices, and of database technologies and maintenance (e.g. RDS, DynamoDB, Redshift, Aurora, Azure SQL, Google Cloud SQL).
§ General knowledge of networking, firewalls, and load balancers.
§ Strong preference will be given to candidates with AI/ML and GenAI experience.
§ Excellent interpersonal and organizational skills, with the ability to handle diverse situations, complex projects, and changing priorities, behaving with passion, empathy, and care.
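The Blue-Green and Canary deployment techniques named in this role share one idea: shift traffic gradually and roll back on any failed health check. A toy traffic-shifting sketch — the percentages, doubling schedule, and health-check callback are all invented for illustration and do not describe Deloitte's actual process:

```python
def canary_schedule(start=5, factor=2, cap=100):
    """Yield increasing traffic percentages for a canary rollout: 5, 10, 20, ... 100."""
    pct = start
    while pct < cap:
        yield pct
        pct = min(pct * factor, cap)
    yield cap

def roll_out(is_healthy):
    """Advance the canary while health checks pass; roll back to 0% on failure."""
    shifted = 0
    for pct in canary_schedule():
        if not is_healthy(pct):
            return 0  # roll back: route all traffic to the stable version
        shifted = pct
    return shifted

final = roll_out(lambda pct: True)
# healthy at every step, so 100% of traffic ends up on the new version
```

In a real pipeline the `is_healthy` callback would query monitoring (e.g., error rates or latency from the observability stack) rather than a lambda, but the advance-or-rollback loop is the core of the technique.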
How You Will Grow: At Deloitte, our professional development plans focus on helping people at every level of their career to identify and use their strengths to do their best work every day and excel in everything they do.

Our purpose: Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.

Our people and culture: Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.

Professional development: At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India.

Benefits to help you thrive: At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially, and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits.
Eligibility requirements may be based on role, tenure, type of employment, and/or other criteria. Learn more about what working at Deloitte can mean for you.

Recruiting tips: From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.

Requisition code: 300653
Posted 2 weeks ago
5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Description
At Amazon, we strive to be the most innovative and customer-centric company on the planet. Come work with us to develop innovative products, tools, and research-driven solutions in a fast-paced environment by collaborating with smart and passionate leaders, program managers, and software developers. This role is based out of our Bangalore corporate office and is for a passionate, dynamic, analytical, innovative, hands-on, and customer-centric Business Analyst.

Key job responsibilities
This role primarily focuses on deep dives, creating dashboards for the business, and working with different teams to develop and track metrics and bridges.
Design, develop, and maintain scalable, automated, user-friendly systems, reports, dashboards, etc. that will support our analytical and business needs
Conduct in-depth research into the drivers of the Localization business
Analyze key metrics to uncover trends and root causes of issues
Suggest and build new metrics and analyses that enable a better perspective on the business
Capture the right metrics to influence stakeholders and measure success
Develop domain expertise and apply it to operational problems to find solutions
Work across teams with different stakeholders to prioritize and deliver data and reporting
Recognize and adopt best practices in reporting and analysis: data integrity, test design, analysis, validation, and documentation

Basic Qualifications
5+ years of experience analyzing and interpreting data with Redshift, Oracle, NoSQL, etc.
Experience with data visualization using Tableau, QuickSight, or similar tools
Experience with data modeling, warehousing, and building ETL pipelines
Experience using advanced SQL to pull data from a database or data warehouse and scripting experience (Python) to process data for modeling

Preferred Qualifications
Experience with AWS solutions such as EC2, DynamoDB, S3, and Redshift
Experience in data mining, ETL, etc.
and using databases in a business environment with large-scale, complex datasets Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner. Company - ADCI - BLR 14 SEZ Job ID: A3009497
Posted 2 weeks ago
0 years
0 Lacs
India
Remote
Role
We are looking for a Test Engineer to join our team building and testing Creditsafe data. You will work closely with the database and data engineering teams to build systems that facilitate the extraction and transformation of Creditsafe data. Based on the test strategy and approach, you will develop, enhance, and execute tests that add value to Creditsafe data. You will act as a primary source of guidance to Junior Test Engineers and Test Engineers in all areas of data quality, and will contribute to the team using data quality best practices and techniques. You can confidently communicate test results to your team members and stakeholders using evidence and reports. You will act as a mentor and coach to less experienced members of the test team, and will promote and coach leading practices in data test management, design, and implementation. You will be part of an Agile team and will contribute effectively to its ceremonies, acting as the quality specialist within that team. You are an influencer who will provide leadership in defining and implementing agreed standards, actively promoting them within your team and the wider development community. The ideal candidate has extensive experience in mentorship and leading by example, and is able to communicate values consistent with the Creditsafe philosophy of engagement. You have critical thinking skills and can communicate diplomatically within and outside your areas of responsibility, challenging assumptions where required.

Required Skills:
Proven working experience as a data test engineer, business data analyst, or ETL tester.
Technical expertise regarding data models, database design and development, data mining, and segmentation techniques
Strong knowledge of and experience with SQL databases
Hands-on experience with best engineering practices (handling and logging errors, system monitoring, and building human-fault-tolerant applications)
Knowledge of statistics and experience using statistical packages for analysing datasets (Excel, SPSS, SAS, etc.) is an advantage
Comfortable working with relational databases such as Redshift, Oracle, PostgreSQL, MySQL, and MariaDB (PostgreSQL preferred)
Strong analytical skills with the ability to collect, organise, analyse, and disseminate significant amounts of information with attention to detail and accuracy
Adept at queries, report writing, and presenting findings
A BS in Mathematics, Economics, Computer Science, Information Management, or Statistics is desirable but not essential
A good understanding of cloud technology, preferably AWS and/or Azure DevOps
A practical understanding of programming: JavaScript, Python
Excellent communication skills
Practical experience of testing in an Agile approach

Desirable Skills
An understanding of version control systems
Practical experience of conducting code reviews
Practical experience of pair testing and pair programming

Primary Responsibilities
Reports to Engineering Lead
Work as part of the engineering team in data acquisition
Designing and implementing processes and tools to monitor and improve the quality of Creditsafe's data
Developing and executing test plans to verify the accuracy and reliability of data
Working with data analysts and other stakeholders to establish and maintain data governance policies
Identifying and resolving issues with the data, such as errors, inconsistencies, or duplication
Collaborating with other teams, such as data analysts and data scientists, to ensure the quality of data used for various projects and initiatives
Providing training and guidance to other team members on data quality best practices and techniques
Monitoring and reporting on key data quality metrics, such as data completeness and accuracy
Continuously improving data quality processes and tools based on feedback and analysis
Work closely with your Agile team to promote a whole-team approach to quality
Document approaches and processes that improve the quality effort, for use by team members and the wider test function
Apply strong practical knowledge of software testing techniques, advising on and selecting the correct technique for the problem at hand
Conduct analysis of the team's test approach, taking a proactive role in formulating the relevant quality criteria in line with team goals
Work with team members to define standards and processes applicable to your area of responsibility
Monitor progress of team deliverables, raising quality concerns in a timely, effective manner
Gain a sufficient understanding of the system architecture to inform your test approach and that of the test engineers
Create and maintain concise, accurate defect reports in line with the established defect process

Job Types: Full-time, Permanent
Benefits: Flexible schedule, Health insurance, Provident Fund, Work from home
Schedule: Monday to Friday
Supplemental Pay: Performance bonus
Work Location: In person
Speak with the employer +91 9121185668
Posted 2 weeks ago