8.0 - 12.0 years
11 - 15 Lacs
Gurugram, India
Work from Office
Hello Visionary! We empower our people to stay resilient and relevant in a constantly changing world. We're looking for people who are always searching for creative ways to grow and learn, people who want to make a real impact, now and in the future. Does that sound like you? Then it seems like you'd make an outstanding addition to our vibrant team.

Siemens Mobility is an independently run company of Siemens AG. Its core business includes rail vehicles, rail automation and electrification solutions, turnkey systems, intelligent road traffic technology and related services. In Mobility, we help our customers meet the need for hard-working mobility solutions. We're making the lives of people who travel easier and more enjoyable while constantly developing new, intelligent mobility solutions!

We are looking for a Project Manager.

You'll make a difference by:
- Contributing to the definition of target costs and cost controlling in coordination with the interface partners.
- Defining project-specific targets (cost, time, quality) via the Technical Manager (TM) and Project Procurement Manager (PPM) for the Product Solutions and Creation (PSC) teams, taking into account the overall targets and, if required, the overall DTC measures and their implementation.
- Implementing customer requirements in terms of time, cost and quality based on the technical solutions from the PSC teams.
- Stepping in after escalation from the PSC teams, in coordination with the Sub Project Manager and, if necessary, the other core team members, as soon as the project targets cannot be met, the project's frame conditions may be exceeded (time, quality, costs, customer targets/requirements), or an entrepreneurial decision is needed (a decision under uncertainty with a possible impact on the targets).
- Acting as the person responsible for decisions in all aspects with the customer, particularly all contract and claim issues. Informing the PSC team members immediately if the frame conditions change. Taking part in PSC team meetings when agreed with the PSC team members.
- Taking responsibility for DTC measures in close coordination with the PSC team, and for coordinating scope shifts between the PSC teams with the TM.
- Customer approval: coordinating the correct handover and approval to the end customer (including, e.g., acceptance tests and documentation) and supporting the warranty phase.
- Keeping facts and figures up to date for a transparent overview of progress and target achievement, and reporting to the Project Manager within the internal core team meetings.
- Transferring all issues to a successor or to the Sub Project Manager in a transparent way when leaving the project.
- Coordinating all supplier activities with the PSC team members (e.g., customer visits at supplier sites), material provided by the customer, and the MO-internal supply.

Desired Skills:
- Bachelor's degree in a relevant field along with a minimum of 8-12 years of experience in project management.
- Responsible for making decisions and issuing instructions for all aspects relating to timescales and project handling for the associated modules, components and systems, to ensure compliance with the project requirements.
- In the event of conflicts of interest with line management, can escalate the case to the PD in consultation with the Sub Project Manager.
- Represents the Sub Project Manager in sourcing decisions and attends PSC team status meetings.
- Detail-oriented with a high level of accuracy
- Ability to work effectively in a team environment

Join us and be yourself! Make your mark in our exciting world at Siemens. This role is based in Gurgaon; you might be required to visit other locations within India and abroad. In return, you'll get the chance to work with teams impacting the shape of things to come. Find out more about mobility at
Posted 2 weeks ago
5.0 - 10.0 years
10 - 15 Lacs
Bengaluru
Work from Office
As a Fortune 50 company with more than 400,000 team members worldwide, Target is an iconic brand and one of America's leading retailers. Joining Target means promoting a culture of mutual care and respect and striving to make the most meaningful and positive impact. Becoming a Target team member means joining a community that values different voices and lifts each other up. Here, we believe your unique perspective is important, and you'll build relationships by being authentic and respectful.

Overview about Target in India: At Target, we have a timeless purpose and a proven strategy, and that hasn't happened by accident. Some of the best minds from different backgrounds come together at Target to redefine retail in an inclusive learning environment that values people and delivers world-class outcomes. That winning formula is especially apparent in Bengaluru, where Target in India operates as a fully integrated part of Target's global team and has more than 4,000 team members supporting the company's global strategy and operations.

About the Role: The Senior RBX Data Specialist role at Target in India involves the end-to-end management of data: building and maintaining pipelines through ETL/ELT and data modeling, ensuring data accuracy and system performance, and resolving data flow issues. It also requires analyzing data to generate insights, creating visualizations for stakeholders, automating processes for efficiency, and collaborating effectively across both business and technical teams. You will also answer ad-hoc questions from your business users by conducting quick analysis on relevant data, identifying trends and correlations, and forming hypotheses to explain the observations. Some of this will lead to bigger projects of increased complexity, where you will have to work as part of a bigger team but also independently execute specific tasks. Finally, you are expected to always adhere to the project schedule and technical rigor, as well as requirements for documentation, code versioning, etc.

Key Responsibilities:
- Data Pipeline and Maintenance: Monitor data pipelines and warehousing systems to ensure optimal health and performance. Ensure data integrity and accuracy throughout the data lifecycle.
- Incident Management and Resolution: Drive the resolution of data incidents and document their causes and fixes, collaborating with teams to prevent recurrence.
- Automation and Process Improvement: Identify and implement automation opportunities and DataOps best practices to enhance the efficiency, reliability, and scalability of data processes.
- Collaboration and Communication: Work closely with data teams and stakeholders to understand data pipeline architecture and dependencies, ensuring timely and accurate data delivery while effectively communicating data issues and participating in relevant discussions.
- Data Quality and Governance: Implement and enforce data quality standards, monitor metrics for improvement, and support data governance by ensuring policy compliance.
- Documentation and Reporting: Create and maintain clear and concise documentation of data pipelines, processes, and troubleshooting steps. Develop and generate reports on data operations performance and key metrics.
Core responsibilities are described within this job description. Job duties may change at any time due to business needs.

About You:
- B.Tech / B.E. or equivalent (completed) degree
- 5+ years of relevant work experience
- Experience in Marketing/Customer/Loyalty/Retail analytics is preferable
- Exposure to A/B testing (a minimal sketch follows this listing)
- Familiarity with big data technologies, data languages, and visualization tools
- Exposure to languages such as Python and R for data analysis and modelling
- Proficiency in SQL for data extraction, manipulation, and analysis, with experience in big data query frameworks such as Hive, Presto, or BigQuery
- Solid foundational knowledge of mathematics, statistics, and predictive modelling techniques, including linear regression, logistic regression, time-series models, and classification techniques
- Ability to simplify complex technical and analytical methodologies for broad audiences
- Ability to identify process and tool improvements and implement change
- Excellent written and verbal English communication skills for global working
- Motivation to initiate, build, and maintain global partnerships
- Ability to function in group and/or individual settings
- Willing and able to work from our office location (Bangalore HQ) as required by business needs and brand initiatives

Useful Links:
- Life at Target: https://india.target.com/
- Benefits: https://india.target.com/life-at-target/workplace/benefits
- Culture: https://india.target.com/life-at-target/belonging
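To give a concrete flavour of the A/B-testing exposure this role asks for, here is a minimal, hypothetical sketch of a two-proportion significance check in Python; the conversion counts, group sizes, and threshold are illustrative assumptions, not figures from the posting.

```python
# Hypothetical A/B readout: two-proportion z-test on conversion counts.
from statsmodels.stats.proportion import proportions_ztest

conversions = [480, 530]    # made-up successes: control, variant
samples = [10_000, 10_000]  # made-up observations per group

z_stat, p_value = proportions_ztest(conversions, samples)
print(f"z = {z_stat:.3f}, p = {p_value:.4f}")
print("significant at 5%" if p_value < 0.05 else "not significant at 5%")
```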
Posted 2 weeks ago
3.0 - 4.0 years
5 - 7 Lacs
Chandigarh
Work from Office
Key Responsibilities:
- Design, develop, and maintain scalable ETL workflows using Cloud Data Fusion and Apache Airflow.
- Configure and manage various data connectors (e.g., Cloud Storage, Pub/Sub, JDBC, SaaS APIs) for batch and streaming data ingestion.
- Implement data transformations, cleansing, and enrichment logic in Python (and SQL) to meet analytic requirements.
- Optimize BigQuery data models (fact/dimension tables, partitioning, clustering) for performance and cost-efficiency.
- Monitor, troubleshoot, and tune pipeline performance; implement robust error-handling and alerting mechanisms.
- Collaborate with data analysts, BI developers, and architects to understand data requirements and deliver accurate datasets.
- Maintain documentation for data pipelines, schemas, and operational runbooks.
- Ensure data security and governance best practices are followed across the data lifecycle.

Minimum Qualifications:
- 3+ years of hands-on experience in data engineering, with a focus on cloud-native ETL.
- Proven expertise with Google Cloud Data Fusion, including pipeline authoring and custom plugin development.
- Solid experience building and orchestrating pipelines in Apache Airflow (DAG design, operators, hooks; see the DAG sketch after this listing).
- Strong Python programming skills for data manipulation and automation.
- Deep understanding of BigQuery: schema design, SQL scripting, performance tuning, and cost management.
- Familiarity with additional GCP services: Cloud Storage, Pub/Sub, Dataflow, and IAM.
- Experience with version control (Git), CI/CD pipelines, and DevOps practices for data projects.
- Excellent problem-solving skills, attention to detail, and the ability to work independently in a fast-paced environment.
- Immediate availability to join.

Preferred (Nice-to-Have):
- Experience with other data integration tools (e.g., Dataflow, Talend, Informatica).
- Knowledge of containerization (Docker, Kubernetes) for scalable data workloads.
- Familiarity with streaming frameworks (Apache Beam, Spark Streaming).
- Background in data modeling methodologies (Star/Snowflake schemas).
- Exposure to metadata management, data cataloguing, and data governance frameworks.
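As one hedged illustration of the Airflow skills listed above (DAG design, operators), here is a minimal daily load-then-transform DAG using the Google provider package. The bucket, project, dataset, and table names are placeholders rather than the employer's actual pipeline, and the `schedule` argument assumes Airflow 2.4 or newer.

```python
# Hypothetical daily DAG: load CSVs from GCS into a raw table, then aggregate.
# Requires apache-airflow>=2.4 and apache-airflow-providers-google.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

with DAG(
    dag_id="daily_sales_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # "schedule_interval" on Airflow < 2.4
    catchup=False,
) as dag:
    load_raw = GCSToBigQueryOperator(
        task_id="load_raw",
        bucket="example-landing-bucket",          # placeholder bucket
        source_objects=["sales/{{ ds }}/*.csv"],  # one folder per day
        destination_project_dataset_table="example-project.analytics.raw_sales",
        source_format="CSV",
        write_disposition="WRITE_TRUNCATE",
    )

    aggregate = BigQueryInsertJobOperator(
        task_id="aggregate",
        configuration={
            "query": {
                "query": (
                    "SELECT order_id, SUM(amount) AS total "
                    "FROM `example-project.analytics.raw_sales` "
                    "GROUP BY order_id"
                ),
                "useLegacySql": False,
                "destinationTable": {
                    "projectId": "example-project",
                    "datasetId": "analytics",
                    "tableId": "sales_daily",
                },
                "writeDisposition": "WRITE_TRUNCATE",
            }
        },
    )

    load_raw >> aggregate
```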
Posted 2 weeks ago
8.0 - 12.0 years
20 - 35 Lacs
Hyderabad, Bengaluru
Hybrid
Essential Responsibilities:
Architecture & Design
- Define and document the overall data platform architecture in GCP, including ingestion (Pub/Sub, Dataflow), storage (BigQuery, Cloud Storage), and orchestration (Composer, Workflows).
- Establish data modeling standards (star/snowflake schemas, partitioning, clustering) to optimize performance and cost.
Platform Implementation
- Build scalable, automated ETL/ELT pipelines for IoT telemetry and events.
- Implement streaming analytics and CDC where required to support real-time dashboards and alerts.
Data Products & Exchange
- Collaborate with data scientists and product managers to package curated datasets and ML feature tables as consumable data products.
- Architect and enforce a secure, governed data exchange layer, leveraging BigQuery Authorized Views, Data Catalog, and IAM, to monetize data externally.
Cost Management & Optimization
- Design cost-control measures: table partitioning/clustering, query cost monitoring, budget alerts, and committed-use discounts (a partitioning sketch follows this listing).
- Continuously analyze query performance and storage utilization to drive down TCO.
Governance & Security
- Define and enforce data governance policies (cataloging, lineage, access controls) using Cloud Data Catalog and Cloud IAM.
- Ensure compliance with privacy, security, and regulatory requirements for internal and external data sharing.
Stakeholder Enablement
- Partner with business stakeholders to understand data needs and translate them into platform capabilities and SLAs.
- Provide documentation, training, and self-service tooling (Data Studio templates, APIs, notebooks) to democratize data access.
Mentorship & Leadership
- Coach and mentor engineers on big data best practices, SQL optimization, and cloud-native architecture patterns.
- Lead architecture reviews, proof-of-concepts, and pilot projects to evaluate emerging technologies (e.g., BigQuery Omni, Vertex AI).

Minimum Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 8+ years designing and operating large-scale data platforms, with at least 5 years of hands-on experience in GCP (BigQuery, Dataflow, Pub/Sub).
- Deep expertise in BigQuery performance tuning, data partitioning/clustering, and cost-control techniques.
- Proven track record building streaming and batch pipelines (Apache Beam, Dataflow, Spark).
- Strong SQL skills and experience with data modeling for analytics.
- Familiarity with data governance tools: Data Catalog, IAM, VPC Service Controls.
- Experience with Python or Java for ETL/ELT development.
- Excellent communication skills, able to translate technical solutions for non-technical stakeholders.
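To make the partitioning/clustering cost controls mentioned above concrete, here is a small sketch using the google-cloud-bigquery client to create a date-partitioned, clustered table with automatic partition expiry. The project, dataset, and schema are assumptions for illustration only.

```python
# Create a date-partitioned, clustered table with automatic partition expiry.
# "example-project" and the telemetry dataset/schema are made up for the sketch.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # assumes ambient credentials

ddl = """
CREATE TABLE IF NOT EXISTS `example-project.telemetry.device_events` (
  event_ts   TIMESTAMP,
  device_id  STRING,
  payload    JSON
)
PARTITION BY DATE(event_ts)   -- queries filtered on date scan only those partitions
CLUSTER BY device_id          -- co-locates rows for selective device_id filters
OPTIONS (partition_expiration_days = 90)  -- old partitions age out, capping storage
"""

client.query(ddl).result()  # blocks until the DDL job completes
print("table created (or already present)")
```

Partitioning bounds how many bytes a date-filtered query scans (and therefore its cost), while clustering orders data within each partition so highly selective filters read even less.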
Posted 2 weeks ago
7.0 - 8.0 years
15 - 25 Lacs
Bengaluru
Work from Office
Role & responsibilities:
- Oversee the design, implementation, and optimization of data warehousing solutions leveraging tools like Snowflake, Databricks, and other cloud data platforms.
- Lead the delivery of software projects from initiation through implementation.
- Lead the delivery of ETL processes for ingesting, transforming, and managing large-scale datasets.
- Lead the delivery of data analytics dashboards and reports on modern data stacks.
- Develop project plans, allocate resources, and track progress using project management tools such as Jira, Asana, Trello, or MS Project.
- Act as the primary point of contact for clients, building strong relationships, providing regular updates, and addressing concerns promptly.
- Manage risks and resolve project roadblocks to ensure timely delivery of high-quality solutions.
- Ensure projects align with data governance best practices, security protocols, and client standards.
- Provide technical guidance to the development team, ensuring high-quality and timely delivery.
- Work with stakeholders to define KPIs and ensure delivery meets the business and technical goals.
- Drive continuous improvement initiatives in delivery processes, data quality, and team efficiency.
- Provide leadership and mentoring to project teams, fostering a culture of collaboration, accountability, and excellence.

Preferred candidate profile:
- Bachelor's degree in Computer Science, Information Systems, Data Engineering, or a related field.
- 5+ years of experience managing the delivery of data warehouse, data engineering, and data analytics projects.
- Strong experience with cloud-based data platforms such as Snowflake, Databricks, or Amazon Redshift.
- Proficiency in managing ETL pipelines and understanding data transformation processes.
- Solid knowledge of data warehousing concepts (e.g., dimensional modelling, star/snowflake schema, OLAP/OLTP systems).
- Experience working with SQL for data querying, performance optimization, and testing.
- Proven ability to manage multiple stakeholders, prioritize tasks, and ensure client satisfaction.
- Proficiency with project management tools: Jira, Asana, Trello, or MS Project.
- Familiarity with Agile, Scrum, and Waterfall methodologies.
Posted 2 weeks ago
5.0 - 8.0 years
7 - 10 Lacs
Bengaluru
Work from Office
Role Purpose: The purpose of the role is to support process delivery by ensuring daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialists.

Do:
- Oversee and support the process by reviewing daily transactions on performance parameters.
- Review the performance dashboard and the scores for the team.
- Support the team in improving performance parameters by providing technical support and process guidance.
- Record, track, and document all queries received, problem-solving steps taken, and total successful and unsuccessful resolutions.
- Ensure standard processes and procedures are followed to resolve all client queries.
- Resolve client queries as per the SLAs defined in the contract.
- Develop an understanding of the process/product for the team members to facilitate better client interaction and troubleshooting.
- Document and analyze call logs to spot the most frequently occurring trends and prevent future problems.
- Identify red flags and escalate serious client issues to the team leader in cases of untimely resolution.
- Ensure all product information and disclosures are given to clients before and after the call/email requests.
- Avoid legal challenges by monitoring compliance with service agreements.
- Handle technical escalations through effective diagnosis and troubleshooting of client queries.
- Manage and resolve technical roadblocks/escalations as per SLA and quality requirements; if unable to resolve an issue, escalate it to TA & SES in a timely manner.
- Provide product support and resolution to clients by performing a question diagnosis while guiding users through step-by-step solutions.
- Troubleshoot all client queries in a user-friendly, courteous, and professional manner.
- Offer alternative solutions to clients (where appropriate) with the objective of retaining customers' and clients' business.
- Organize ideas and effectively communicate oral messages appropriate to listeners and situations.
- Follow up and make scheduled callbacks to customers to record feedback and ensure compliance with contract SLAs.
- Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client.
- Mentor and guide Production Specialists on improving technical knowledge.
- Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists.
- Develop and conduct trainings (triages) within products for Production Specialists as per target, and inform the client about the triages being conducted.
- Undertake product trainings to stay current with product features, changes, and updates; enroll in product-specific and other trainings per client requirements/recommendations.
- Identify and document the most common problems and recommend appropriate resolutions to the team.
- Update job knowledge by participating in self-learning opportunities and maintaining personal networks.

Mandatory Skills: Google AlloyDB
Experience: 5-8 Years
Posted 2 weeks ago
3.0 - 4.0 years
3 - 7 Lacs
Mumbai
Work from Office
Job Summary: We are seeking an experienced and motivated Data Engineer to join our growing team, preferably with experience in the Banking, Financial Services, and Insurance (BFSI) sector. The ideal candidate will have a strong background in designing, building, and maintaining robust and scalable data infrastructure. You will play a crucial role in developing our data ecosystem, ensuring data quality, and empowering data-driven decisions across the organization. This role requires hands-on experience with the Google Cloud Platform (GCP) and a passion for working with cutting-edge data technologies.

Responsibilities:
- Design and develop end-to-end data engineering pipelines: build and maintain scalable, reliable pipelines to ingest, process, and transform large volumes of structured and unstructured data from various sources.
- Implement data quality and governance: establish and enforce processes for data validation, transformation, auditing, and reconciliation to ensure data accuracy, completeness, and consistency.
- Build and maintain data storage solutions: design, implement, and manage data vaults and data marts to support business intelligence, analytics, and reporting requirements.
- Orchestrate and automate workflows: utilize workflow management tools to schedule, monitor, and automate complex data workflows and ETL processes.
- Optimize data infrastructure: continuously evaluate and improve the performance, reliability, and cost-effectiveness of our data infrastructure and pipelines.
- Collaborate with stakeholders: work closely with data analysts, data scientists, and business stakeholders to understand their data needs and deliver effective data solutions.
- Documentation: create and maintain comprehensive documentation for data pipelines, processes, and architectures.

Key Skills:
- Python: proficient in Python for data engineering tasks, including scripting, automation, and data manipulation.
- PySpark: strong experience with PySpark for large-scale data processing and analytics (a minimal sketch follows this listing).
- SQL: expertise in writing complex SQL queries for data extraction, transformation, and analysis.

Tech Stack (Must Have) - Google Cloud Platform (GCP):
- Dataproc: for managing and running Apache Spark and Hadoop clusters.
- Composer (Airflow): for creating, scheduling, and monitoring data workflows.
- Cloud Functions: for event-driven serverless data processing.
- Cloud Run: for deploying and scaling containerized data applications.
- Cloud SQL: for managing relational databases.
- BigQuery: for data warehousing, analytics, and large-scale SQL queries.

Qualifications:
- Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field.
- 3+ years of proven experience in a Data Engineer role.
- Demonstrable experience with the specified must-have tech stack.
- Strong problem-solving skills and the ability to work independently and as part of a team.
- Excellent communication and interpersonal skills.

Good to Have:
- Experience in the BFSI (Banking, Financial Services, and Insurance) domain.
- Apache NiFi: experience with data flow automation and management.
- Qlik: familiarity with business intelligence and data visualization tools.
- AWS: knowledge of Amazon Web Services data services.
- DevOps and FinOps: understanding of DevOps principles and practices (CI/CD, IaC) and cloud financial management (FinOps) to optimize cloud spending.
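The PySpark sketch referenced in the Key Skills list could look like the following: a small batch job, of the kind that might run on Dataproc, that reads raw Parquet from GCS, applies basic validation, and writes a curated, partitioned copy. All bucket paths and column names are hypothetical.

```python
# Batch curation job: read raw Parquet from GCS, validate, write partitioned output.
# Paths and columns (event_id, amount) are invented for the sketch; on Dataproc
# the gs:// connector is available out of the box.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("curate_events").getOrCreate()

raw = spark.read.parquet("gs://example-raw-zone/events/")

curated = (
    raw.dropDuplicates(["event_id"])                 # basic deduplication
       .filter(F.col("amount").isNotNull())          # simple validation rule
       .withColumn("ingest_date", F.current_date())  # audit/partition column
)

(curated.write
        .mode("overwrite")
        .partitionBy("ingest_date")
        .parquet("gs://example-curated-zone/events/"))

spark.stop()
```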
Posted 2 weeks ago
5.0 - 10.0 years
17 - 20 Lacs
Mumbai
Work from Office
KEY RESPONSIBILITIES:
- Multitask and manage various projects.
- Evaluate business processes, anticipate requirements, uncover areas for improvement, and develop and implement solutions.
- Communicate the documented business requirements to the technical team.
- Review FSDs prepared by IT and ensure all client requirements are met.
- Prepare and review manual test scenarios and test cases.
- Liaise with IT vendors on their project deliveries, bugs, etc.
- Lead the UAT team for project closure, ensuring complete functional and regression testing is done before go-live.
- Ensure smooth deployment and monitoring in production, including post-production sign-off.
- Maintain the project tracker.

INTERACTIONS:
- Internal relations: New Business team, other project teams, IT team.
- External relations: IT vendors such as CTS, Hansa, Posidex.

REQUIRED QUALIFICATION AND SKILLS:
- Educational qualifications: Graduate in Science/IT; Postgraduate in Science/IT/MBA.
- Work experience: at least 5+ years of project management experience, preferably in the BFSI domain (experience in the insurance domain is an added advantage).
- Certifications: testing certificate.
- Other skills: team management capability; a proactive attitude, taking responsibility for completing multiple projects within defined timelines; good oral and written communication skills; the ability to analyse all outcomes of a situation and take steps accordingly.
Posted 2 weeks ago
5.0 - 8.0 years
7 - 10 Lacs
Bengaluru
Work from Office
Role Purpose: The purpose of the role is to support process delivery by ensuring daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialists.

Do:
- Oversee and support the process by reviewing daily transactions on performance parameters.
- Review the performance dashboard and the scores for the team.
- Support the team in improving performance parameters by providing technical support and process guidance.
- Record, track, and document all queries received, problem-solving steps taken, and total successful and unsuccessful resolutions.
- Ensure standard processes and procedures are followed to resolve all client queries.
- Resolve client queries as per the SLAs defined in the contract.
- Develop an understanding of the process/product for the team members to facilitate better client interaction and troubleshooting.
- Document and analyze call logs to spot the most frequently occurring trends and prevent future problems.
- Identify red flags and escalate serious client issues to the team leader in cases of untimely resolution.
- Ensure all product information and disclosures are given to clients before and after the call/email requests.
- Avoid legal challenges by monitoring compliance with service agreements.
- Handle technical escalations through effective diagnosis and troubleshooting of client queries.
- Manage and resolve technical roadblocks/escalations as per SLA and quality requirements; if unable to resolve an issue, escalate it to TA & SES in a timely manner.
- Provide product support and resolution to clients by performing a question diagnosis while guiding users through step-by-step solutions.
- Troubleshoot all client queries in a user-friendly, courteous, and professional manner.
- Offer alternative solutions to clients (where appropriate) with the objective of retaining customers' and clients' business.
- Organize ideas and effectively communicate oral messages appropriate to listeners and situations.
- Follow up and make scheduled callbacks to customers to record feedback and ensure compliance with contract SLAs.
- Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client.
- Mentor and guide Production Specialists on improving technical knowledge.
- Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists.
- Develop and conduct trainings (triages) within products for Production Specialists as per target, and inform the client about the triages being conducted.
- Undertake product trainings to stay current with product features, changes, and updates; enroll in product-specific and other trainings per client requirements/recommendations.
- Identify and document the most common problems and recommend appropriate resolutions to the team.
- Update job knowledge by participating in self-learning opportunities and maintaining personal networks.

Mandatory Skills: Google BigQuery
Experience: 5-8 Years
Posted 2 weeks ago
5.0 - 8.0 years
7 - 10 Lacs
Hyderabad
Work from Office
Role Purpose: The purpose of this role is to interpret data and turn it into information (reports, dashboards, interactive visualizations, etc.) which can offer ways to improve a business, thus affecting business decisions.

Do:
1. Manage the technical scope of the project in line with the requirements at all stages:
   a. Gather information from various sources (data warehouses, databases, data integration and modelling) and interpret patterns and trends.
   b. Develop record management processes and policies.
   c. Build and maintain relationships at all levels within the client base and understand their requirements.
   d. Provide sales data, proposals, data insights, and account reviews to the client base.
   e. Identify areas to increase efficiency and automation of processes.
   f. Set up and maintain automated data processes.
   g. Identify, evaluate, and implement external services and tools to support data validation and cleansing.
   h. Produce and track key performance indicators.
2. Analyze the data sets and provide adequate information:
   a. Liaise with internal and external clients to fully understand data content.
   b. Design and carry out surveys and analyze survey data as per the customer requirement.
   c. Analyze and interpret complex data sets relating to the customer's business and prepare reports for internal and external audiences using business analytics reporting tools.
   d. Create data dashboards, graphs, and visualizations to showcase business performance and provide sector and competitor benchmarking.
   e. Mine and analyze large datasets, draw valid inferences, and present them successfully to management using a reporting tool.
   f. Develop predictive models and share insights with the clients as per their requirements.

Mandatory Skills: Google BigQuery
Experience: 5-8 Years
Posted 2 weeks ago
5.0 - 8.0 years
7 - 10 Lacs
Chennai
Work from Office
- Hands-on experience in data modelling for both OLTP and OLAP systems.
- In-depth knowledge of conceptual, logical, and physical data modelling.
- Strong understanding of indexing, partitioning, and data sharding, with practical experience.
- Experience in identifying and addressing factors affecting database performance for near-real-time reporting and application interaction.
- Proficiency with at least one data modelling tool (preferably DB Schema).
- Functional knowledge of the mutual fund industry is a plus.
- Familiarity with GCP databases like AlloyDB, Cloud SQL, and BigQuery.
- Willingness to work from the Chennai customer site (office presence is mandatory, five days of on-site work each week).

Mandatory Skills: Cloud-PaaS-GCP-Google Cloud Platform
Experience: 5-8 Years
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
vadodara, gujarat
On-site
You will be working with Polymer, a smart data loss prevention (DLP) system that offers advanced cloud and AI data security and compliance solutions. By leveraging Polymer, you will play a crucial role in automating data protection processes, reducing data exposure risks, and enabling employees to enhance data security practices seamlessly within their existing workflows.

Your responsibilities will include designing, developing, and maintaining ETL processes within large-scale data environments using tools such as Snowflake and BigQuery. You will be tasked with constructing and deploying data pipelines to manage data ingestion, transformation, and loading operations from diverse sources. Additionally, you will create and manage data models and schemas optimized for performance and scalability, leveraging BI tools like QuickSight, Tableau, or Sigma to generate interactive dashboards and reports.

Collaboration with stakeholders to grasp business requirements and convert them into technical solutions will be a key aspect of your role. You will communicate complex data insights clearly to both technical and non-technical audiences, proactively identify and resolve data quality issues and performance bottlenecks, and contribute to enhancing the data infrastructure and best practices within the organization.

As a qualified candidate, you should hold a Bachelor's or Master's degree in Computer Science, Data Science, Computer Engineering, or a related field, along with 3-5 years of experience in a data science/engineering role. Proficiency in Python, including experience with Django or Flask, is essential, while expertise in Snowflake and BigQuery is advantageous. Experience with relational databases like MySQL or PostgreSQL, designing ETL processes in large-scale data environments, and working with cloud platforms such as AWS or GCP is highly valued.

Your problem-solving and analytical skills, combined with a data-driven mindset, will be crucial in this role. Strong communication and interpersonal skills, and the ability to work both independently and collaboratively within a team, are essential attributes. Familiarity with Agile development methodologies will be beneficial for success in this position. This is an onsite opportunity located in Vadodara, Gujarat, India.
Posted 2 weeks ago
4.0 - 8.0 years
0 Lacs
pune, maharashtra
On-site
As a Java Developer at our Pune, India location, you will be responsible for producing scalable software solutions on distributed systems like Hadoop using the Spark framework. You will work within a cross-functional team involved in the full software development life cycle, from conception to deployment. Your role will require expertise in back-end coding, development frameworks, third-party libraries, and the Spark APIs essential for application development on distributed platforms like Hadoop. Being a team player with a flair for visual design and utility is crucial, along with familiarity with Agile methodologies. Given that a significant portion of the workloads and applications will be cloud-based, knowledge and experience with Google Cloud Platform (GCP) will be beneficial.

Your responsibilities will include collaborating with development teams and product managers to brainstorm software solutions, designing client-side and server-side architecture, building features and applications capable of running on distributed platforms or the cloud, managing well-functioning applications supporting a micro-services architecture, testing software for responsiveness and efficiency, troubleshooting, debugging, and upgrading software, establishing security and data protection settings, writing technical and design documentation, and creating effective APIs (REST and SOAP).

To be successful in this role, you should have proven experience as a Java Developer or in a similar role, familiarity with common stacks, and strong knowledge and working experience of Core Java, Spring Boot, REST APIs, the Spark API, etc. Knowledge of the React framework and UI experience would be advantageous. Proficiency in JUnit, Mockito, or other testing frameworks is necessary, while familiarity with GCP services, design/architecture, and security frameworks is an added advantage. Experience with databases like Oracle, PostgreSQL, and BigQuery, as well as developing on distributed application platforms like Hadoop with Spark, is expected. Excellent communication, teamwork, organizational, and analytical skills are essential, along with a degree in Computer Science, Statistics, or a relevant field and experience working in Agile environments.

It would be beneficial to have knowledge of JavaScript frameworks (e.g., Angular, React, Node.js) and UI/UX design, Python, and NoSQL databases like HBase and MongoDB. The ideal candidate should have 4-7 years of prior working experience in a global banking/insurance/financial organization.

We offer a supportive environment with training and development opportunities, coaching from experts in the team, a culture of continuous learning, and a range of flexible benefits tailored to suit your needs. If you are looking to excel in your career and contribute to a collaborative and inclusive work environment, we invite you to apply for the Java Developer position at our organization. For further information about our company and teams, please visit our website at https://www.db.com/company/company.htm. We strive to create a culture where every individual is empowered to excel together, act responsibly, think commercially, take initiative, and work collaboratively. We celebrate the successes of our people and promote a positive, fair, and inclusive work environment. Join us at Deutsche Bank Group and be part of a team where, together, we achieve excellence every day.
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
chennai, tamil nadu
On-site
You should have a strong understanding of the tech stack, including GCP services such as BigQuery, Cloud Dataflow, Pub/Sub, Dataproc, and Cloud Storage. Experience with data processing tools like Apache Beam (batch/stream), Apache Kafka, and Cloud Dataprep is crucial. Proficiency in programming languages like Python, Java/Scala, and SQL is required. Your expertise should extend to orchestration tools like Apache Airflow (Cloud Composer) and Terraform, and to security aspects including IAM, Cloud Identity, and Cloud Security Command Center. Knowledge of containerization using Docker and Kubernetes (GKE) is essential. Familiarity with machine learning platforms such as Google AI Platform, TensorFlow, and AutoML is expected. Candidates with certifications like Google Cloud Data Engineer and Cloud Architect are preferred.

You should have a proven track record of designing scalable AI/ML systems in production, focusing on high-performance and cost-effective solutions. Strong experience with cloud platforms (Google Cloud, AWS, Azure) and cloud-native AI/ML services like Vertex AI and SageMaker is important. Your role will involve implementing MLOps practices, including model deployment, monitoring, retraining, and version control. Leadership skills are key to guiding teams, mentoring engineers, and collaborating effectively with cross-functional teams to achieve business objectives. A deep understanding of frameworks like TensorFlow, PyTorch, and scikit-learn for designing, training, and deploying models is necessary, as is experience with data engineering principles, scalable pipelines, and distributed systems (e.g., Apache Kafka, Spark, Kubernetes).

Nice-to-have attributes include strong leadership and mentorship capabilities to guide teams towards best practices and high-quality deliverables, excellent problem-solving skills focused on designing efficient, high-performance systems, and effective project management abilities to handle multiple initiatives and ensure timely delivery. Collaboration and teamwork are emphasized to foster a positive and productive work environment.
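As a hedged illustration of the Apache Beam experience this profile calls for, here is a minimal batch pipeline (word count, the canonical Beam example) that runs locally on the DirectRunner and could be submitted to Dataflow with the appropriate runner flags; the file paths are placeholders.

```python
# Canonical Beam word count, runnable locally with the DirectRunner; pass
# --runner=DataflowRunner plus GCP options to run the same code on Dataflow.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions()  # parses sys.argv; defaults to DirectRunner

with beam.Pipeline(options=options) as p:
    (
        p
        | "Read" >> beam.io.ReadFromText("input.txt")          # placeholder path
        | "Split" >> beam.FlatMap(lambda line: line.split())
        | "PairWithOne" >> beam.Map(lambda word: (word, 1))
        | "Count" >> beam.CombinePerKey(sum)
        | "Format" >> beam.MapTuple(lambda word, n: f"{word},{n}")
        | "Write" >> beam.io.WriteToText("counts")             # placeholder path
    )
```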
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
hyderabad, telangana
On-site
The AIML Architect (Dataflow, BigQuery) plays a crucial role within the organization by focusing on designing, implementing, and optimizing data architectures in Google Cloud's BigQuery environment. Your primary responsibility will involve combining advanced data analytics with artificial intelligence and machine learning techniques to create efficient data models that improve decision-making processes across various departments. Building data pipeline solutions that utilize BigQuery and Dataflow functionality to ensure high performance, scalability, and resilience in data workflows will be key. Collaboration with data engineers, data scientists, and application developers is essential to align the technical vision with business goals. Your expertise in cloud-native architectures will be instrumental in driving innovation, efficiency, and insights from vast datasets. The ideal candidate will have a strong background in data processing and AI/ML methodologies and be adept at translating complex technical requirements into scalable solutions that meet the organization's evolving needs.

Responsibilities:
- Design and architect data processing solutions using Google Cloud BigQuery and Dataflow.
- Develop data pipeline frameworks supporting batch and real-time analytics.
- Implement machine learning algorithms to extract insights from large datasets.
- Optimize data storage and retrieval processes to enhance performance.
- Collaborate with data scientists to build scalable models.
- Ensure data quality and integrity throughout the data lifecycle.
- Align data workflows with business objectives through collaboration with cross-functional teams.
- Conduct technical evaluations of new tools and technologies.
- Manage large-scale data migrations to cloud environments.
- Document architecture designs and maintain technical specifications.
- Provide mentorship to junior data engineers and analysts.
- Stay updated on industry trends in cloud computing and data engineering.
- Design and implement security best practices for data access and storage.
- Monitor and troubleshoot data pipeline performance issues.
- Conduct training sessions on BigQuery best practices for team members.

Requirements:
- Bachelor's or Master's degree in Computer Science, Data Science, or a related field.
- 5+ years of experience in data architecture and engineering.
- Proficiency in Google Cloud Platform, specifically BigQuery and Dataflow.
- Strong understanding of data modeling and ETL processes.
- Experience implementing machine learning solutions in cloud environments.
- Proficiency in programming languages like Python, Java, or Scala.
- Expertise in SQL and query optimization techniques.
- Familiarity with big data workloads and distributed computing.
- Knowledge of modern data processing frameworks and tools.
- Strong analytical and problem-solving skills.
- Excellent communication and team collaboration abilities.
- Track record of managing comprehensive projects from inception to completion.
- Ability to work in a fast-paced, agile environment.
- Understanding of data governance, compliance, and security.
- Experience with data visualization tools is a plus.
- Certifications in Google Cloud or relevant technologies are advantageous.
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
karnataka
On-site
The team member is expected to work directly with the client in a fast-paced environment to address their day-to-day analytics needs. You will need to address business problems by incorporating the business context to design and develop solutions that leverage statistical and advanced analytics methodologies. You should be able to perform root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement. Your responsibilities will also include building efficiencies through standardization and automation.

The indicative areas of work are:
- Digital analytics: implementing, auditing, and leveraging GA4 to measure and enhance product usage, utilizing GTM to track custom events, and visualizing data through Looker Studio via BigQuery.
- User analytics: understanding user behavior in the app, feature adoption, retention and churn drivers, transaction behavior across features, offer and scratch-card redemption, and the impact of email and notifications.
- Marketing measurement: evaluating the effectiveness of marketing campaigns through channels like ATL and performance media.

If you enjoy wild growth and working with happy, enthusiastic over-achievers, you'll find your career fulfilling with us.
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
noida, uttar pradesh
On-site
Apply Digital is a global digital transformation partner for change agents, offering expertise in Business Transformation Strategy, Product Design & Development, Commerce, Platform Engineering, Data Intelligence, and Change Management. Our goal is to help clients modernize their organizations and deliver meaningful impact to both their business and customers, whether they are initiating, accelerating, or optimizing their digital transformation journey. We specialize in implementing composable tech and leveraging our experience in building smart products and utilizing AI tools to drive value. With over 650 team members, we have successfully transformed global companies like Kraft Heinz, NFL, Moderna, Lululemon, Atlassian, Sony, American Express, and Harvard Business School. Founded in 2016 in Vancouver, Canada, Apply Digital has expanded to nine cities across North America, South America, the UK, and Europe. We are excited to launch a new office in Delhi NCR, India.

At Apply Digital, we embrace the "One Team" approach, operating within a "pod" structure that brings together senior leadership, subject matter experts, and cross-functional skill sets. Our teams work within a common tech and delivery framework supported by well-organized scrum and sprint cadences, ensuring alignment towards desired outcomes through regular retrospectives. We envision Apply Digital as a safe, empowered, respectful, and fun community wherever we operate globally. Our team strives to embody our SHAPE values (smart, humble, active, positive, and excellent), creating a space for connection, growth, and mutual support to make a difference every day.

Apply Digital is a hybrid-friendly organization with remote options available. The preferred candidate should be based in or within commutable distance of Delhi/NCR, India, with working hours that overlap with the Eastern Standard Time (EST) zone.

The client is seeking an experienced Data Engineer to design, build, and maintain scalable data pipelines and architectures to support analytical and operational workloads. Responsibilities include developing and optimizing ETL/ELT pipelines, integrating data pipelines into cloud-native applications, managing cloud data warehouses, implementing data governance and security best practices, collaborating with analytics teams, maintaining data documentation, monitoring and optimizing data pipelines, and staying updated on emerging data engineering technologies.

Requirements:
- Strong proficiency in English (written and verbal communication).
- Experience working with remote teams across different time zones.
- 5+ years of data engineering experience, with expertise in building scalable data pipelines.
- Proficiency in SQL and Python for data modeling and processing.
- Experience with Google Cloud Platform (GCP) and tools like BigQuery, Cloud Storage, and Pub/Sub.
- Knowledge of ETL/ELT frameworks, workflow orchestration tools, and data privacy and security best practices.
- Strong problem-solving skills and excellent communication abilities.

Nice to have:
- Experience with real-time data streaming solutions, machine learning workflows, BI tools, Terraform, and data integrations.
- Knowledge of Infrastructure as Code (IaC) in data environments.
Apply Digital offers comprehensive benefits, including private healthcare coverage, contributions to a provident fund, a gratuity bonus, a flexible vacation policy, engaging projects with global impact, an inclusive and safe work environment, and learning opportunities. Apply Digital is dedicated to celebrating differences, promoting equal opportunity, and creating an inclusive culture where individual uniqueness is valued and recognized.
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
As a Sales Excellence COE Advanced Modeling Manager at Accenture, you will be part of a team that empowers individuals to compete, win, and grow by providing the necessary tools to build client portfolios, optimize deals, and enhance sales talent through sales intelligence. Your role will involve utilizing machine learning algorithms, SQL, R or Python, Advanced Excel, and data visualization tools like Power BI, Power Apps, Tableau, QlikView, and Google Data Studio. Additionally, familiarity with Google Cloud Platform (GCP), BigQuery, Salesforce Einstein Analytics, and optimization packages such as Pyomo, SciPy, PuLP, Gurobi, and CPLEX is beneficial.

In this position, you will work within the Center of Excellence Analytics Modeling Analysis team to generate insights that help Accenture enhance processes, increase sales, and maximize profitability. Your responsibilities will include collecting and processing data from various functions, analyzing data to develop business insights, communicating these insights effectively, and collaborating with the team to create practical solutions.

The ideal candidate should possess a Bachelor's degree or equivalent experience, excellent English communication skills, and a minimum of five years of experience in data modeling and analysis. Proficiency in project management, business acumen, and attention to detail are essential for success in this role. A Master's degree in Analytics or a related field, understanding of sales processes and systems, knowledge of Google Cloud Platform, experience in Sales, Marketing, Pricing, Finance, or related domains, familiarity with Salesforce Einstein Analytics, and expertise in optimization techniques and packages are additional qualifications that would be advantageous.

Join us at Accenture and be part of a dynamic team that is dedicated to enabling sales excellence and driving growth.
Posted 2 weeks ago
15.0 - 21.0 years
0 Lacs
noida, uttar pradesh
On-site
As a Data Architect with over 15 years of experience, your primary responsibility will be to lead the design and implementation of scalable, secure, and high-performing data architectures. You will collaborate with business, engineering, and product teams to develop robust data solutions that support business intelligence, analytics, and AI initiatives.

Your key responsibilities will include designing and implementing enterprise-grade data architectures using cloud platforms such as AWS, Azure, or GCP. You will lead the definition of data architecture standards, guidelines, and best practices while architecting scalable data solutions like data lakes, data warehouses, and real-time streaming platforms. Collaborating with data engineers, analysts, and data scientists, you will ensure optimal solutions are delivered based on data requirements. In addition, you will oversee data modeling activities encompassing conceptual, logical, and physical data models. It will be your duty to ensure data security, privacy, and compliance with relevant regulations such as GDPR and HIPAA. Defining and implementing data governance strategies alongside stakeholders and evaluating data-related tools and technologies are also integral parts of your role.

To excel in this position, you should possess at least 15 years of experience in data architecture, data engineering, or database development. Strong experience architecting data solutions on major cloud platforms like AWS, Azure, or GCP is essential. Proficiency in data management principles, data modeling, ETL/ELT pipelines, and modern data platforms/tools such as Snowflake, Databricks, and Apache Spark is required. Familiarity with programming languages like Python, SQL, or Java, as well as real-time data processing frameworks like Kafka, Kinesis, or Azure Event Hub, will be beneficial. Moreover, experience implementing data governance, data cataloging, and data quality frameworks is important. Knowledge of DevOps practices, CI/CD pipelines for data, and Infrastructure as Code (IaC) is a plus. Excellent problem-solving, communication, and stakeholder management skills are necessary for this role. A Bachelor's or Master's degree in Computer Science, Information Technology, or a related field is preferred, along with certifications like Cloud Architect or Data Architect (AWS/Azure/GCP).

Join us at Infogain, a human-centered digital platform and software engineering company, where you will have the opportunity to work on cutting-edge data and AI projects in a collaborative and inclusive work environment. Experience competitive compensation and benefits while contributing to experience-led transformation for our clients in various industries.
Posted 2 weeks ago
4.0 - 8.0 years
0 Lacs
karnataka
On-site
You are an experienced Senior QA Specialist being sought to join a dynamic team for a critical AWS-to-GCP migration project. Your primary responsibility will be the rigorous testing of data pipelines and data integrity in the GCP cloud to ensure seamless reporting and analytics capabilities.

Your key responsibilities will include designing and executing test plans to validate data pipelines re-engineered from AWS to GCP, ensuring data integrity and accuracy. You will work closely with data engineering teams to understand AVRO, ORC, and Parquet file structures in AWS S3, and analyze the data in external tables created in Athena and used for reporting. It will be essential to ensure that the schema and data in BigQuery match those in Athena to support reporting in Power BI. Additionally, you will be required to test and validate Spark pipelines and other big data workflows in GCP. Documenting all test results and collaborating with development teams to resolve discrepancies will also be part of your responsibilities, as will supporting UAT business users during UAT testing.

To excel in this role, you should possess proven experience in QA testing within a big data DW/BI ecosystem. Strong familiarity with cloud platforms such as AWS, GCP, or Azure, with hands-on experience in at least one, is necessary. Deep knowledge of data warehousing solutions like BigQuery, Redshift, Synapse, or Snowflake is essential. Expertise in testing data pipelines and understanding different file formats like Avro and Parquet is required. Experience with reporting tools such as Power BI or similar is preferred. Your excellent problem-solving skills and ability to work independently will be valuable, along with strong communication skills and the ability to collaborate effectively across teams.
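One way such a cross-platform validation could be scripted, as a sketch only, is to compare per-partition row counts between Athena and BigQuery. The table names, database, AWS region, S3 results path, and GCP project below are illustrative assumptions, and production code would add pagination and error handling.

```python
# Compare per-partition row counts for one table across Athena and BigQuery.
# All identifiers (sales.orders, region, bucket, project) are placeholders.
import time

import boto3
from google.cloud import bigquery

# Athena side: start the query, poll until it finishes, then read results.
athena = boto3.client("athena", region_name="us-east-1")
qid = athena.start_query_execution(
    QueryString="SELECT dt, COUNT(*) AS n FROM orders GROUP BY dt",
    QueryExecutionContext={"Database": "sales"},
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)["QueryExecutionId"]

while True:
    state = athena.get_query_execution(QueryExecutionId=qid)[
        "QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(2)

rows = athena.get_query_results(QueryExecutionId=qid)["ResultSet"]["Rows"][1:]
athena_counts = {
    r["Data"][0]["VarCharValue"]: int(r["Data"][1]["VarCharValue"]) for r in rows
}

# BigQuery side: run the same aggregate against the migrated table.
bq = bigquery.Client(project="example-project")
bq_sql = "SELECT dt, COUNT(*) AS n FROM sales.orders GROUP BY dt"
bq_counts = {str(row.dt): int(row.n) for row in bq.query(bq_sql).result()}

# Diff the two sides and flag any partition whose counts diverge.
for dt in sorted(set(athena_counts) | set(bq_counts)):
    if athena_counts.get(dt) != bq_counts.get(dt):
        print(f"MISMATCH {dt}: athena={athena_counts.get(dt)} "
              f"bigquery={bq_counts.get(dt)}")
```

Row counts catch gross divergence cheaply; per-column checksums or sampled row-level diffs would be the natural next layer of rigor.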
Posted 2 weeks ago
10.0 - 14.0 years
0 Lacs
karnataka
On-site
As a Product Owner for the GCP Data Migration Project at Clairvoyant, you will play a crucial role in leading the initiative and ensuring successful delivery of data migration solutions on Google Cloud Platform. With your deep understanding of cloud platforms, data migration processes, and Agile methodologies, you will collaborate with cross-functional teams to define the product vision, gather requirements, and prioritize backlogs to align with business objectives and user needs.

Your key responsibilities will include defining and communicating the product vision and strategy, leading requirement-gathering sessions with stakeholders, collaborating with business leaders and technical teams to gather and prioritize requirements, creating user stories and acceptance criteria, participating in sprint planning, establishing key performance indicators, identifying and mitigating risks, and fostering a culture of continuous improvement through feedback collection and iteration on product features and processes.

To be successful in this role, you should have 10-12 years of experience in product management or product ownership, particularly in data migration or cloud projects. You must possess a strong understanding of Google Cloud Platform (GCP) services such as BigQuery, Cloud Storage, and data transfer services, as well as experience with data migration strategies and tools, including ETL processes and data integration methodologies. Proficiency in Agile methodologies, excellent analytical and problem-solving skills, strong communication skills, and a Bachelor's degree in Computer Science, Information Technology, Business, or a related field are essential qualifications.

Additionally, experience with data governance and compliance in cloud environments, familiarity with project management and collaboration tools like JIRA and Confluence, understanding of data architecture and database management, and Google Cloud certifications such as Professional Cloud Architect and Professional Data Engineer are considered good-to-have qualifications.

At Clairvoyant, we provide opportunities for engineers to develop and grow, the chance to work with a team of hardworking and dedicated peers, and growth and mentorship opportunities. We value diversity and encourage individuals with varying skills and qualities to apply, as there might be a suitable role for you in the future. Join us in driving innovation and growth in the technology consulting and services industry!
Posted 2 weeks ago
12.0 - 17.0 years
27 - 35 Lacs
Madurai, Chennai
Work from Office
Dear Candidate,

Greetings of the day! I am Kantha, and I'm reaching out to you regarding an exciting opportunity with TechMango. You can connect with me on LinkedIn (https://www.linkedin.com/in/kantha-m-ashwin-186ba3244/) or by email: kanthasanmugam.m@techmango.net

Techmango Technology Services is a full-scale software development services company founded in 2014 with a strong focus on emerging technologies. Its primary objective is to deliver strategic solutions aligned with the technology goals of its business partners. Techmango is driven by the mantra "Clients' Vision is our Mission", and we hold firmly to that statement. Our aim is to be a technologically advanced and well-loved organization providing high-quality, cost-efficient services built on long-term client relationships. We operate in the USA (Chicago, Atlanta), Dubai (UAE), and India (Bangalore, Chennai, Madurai, Trichy).

Job Title: GCP Data Architect
Location: Madurai/Chennai
Experience: 12+ Years
Notice Period: Immediate

About TechMango: TechMango is a rapidly growing IT Services and SaaS Product company that helps global businesses with digital transformation, modern data platforms, product engineering, and cloud-first initiatives. We are seeking a GCP Data Architect to lead data modernization efforts for our prestigious client, Livingston, in a highly strategic project.

Role Summary: As a GCP Data Architect, you will be responsible for designing and implementing scalable, high-performance data solutions on Google Cloud Platform. You will work closely with stakeholders to define data architecture, implement data pipelines, modernize legacy data systems, and guide data strategy aligned with enterprise goals.
Key Responsibilities:
Lead end-to-end design and implementation of scalable data architecture on Google Cloud Platform (GCP).
Define data strategy, standards, and best practices for cloud data engineering and analytics.
Develop data ingestion pipelines using Dataflow, Pub/Sub, Apache Beam, Cloud Composer (Airflow), and BigQuery (a minimal pipeline sketch follows this listing).
Migrate on-prem or legacy systems to GCP (e.g., from Hadoop, Teradata, or Oracle to BigQuery).
Architect data lakes, warehouses, and real-time data platforms.
Ensure data governance, security, lineage, and compliance (using tools such as Data Catalog, IAM, and DLP).
Guide a team of data engineers and collaborate with business stakeholders, data scientists, and product managers.
Create documentation, high-level design (HLD) and low-level design (LLD), and oversee development standards.
Provide technical leadership in architectural decisions and future-proofing of the data ecosystem.

Required Skills & Qualifications:
10+ years of experience in data architecture, data engineering, or enterprise data platforms.
Minimum 3-5 years of hands-on experience with GCP data services.
Proficiency in BigQuery, Cloud Storage, Dataflow, Pub/Sub, Composer, and Cloud SQL/Spanner.
Python / Java / SQL.
Data modeling (OLTP, OLAP, star/snowflake schema).
Experience with real-time data processing, streaming architectures, and batch ETL pipelines.
Good understanding of IAM, networking, security models, and cost optimization on GCP.
Prior experience leading cloud data transformation projects.
Excellent communication and stakeholder management skills.

Preferred Qualifications:
GCP Professional Data Engineer / Architect certification.
Experience with Terraform, CI/CD, GitOps, and Looker / Data Studio / Tableau for analytics.
Exposure to AI/ML use cases and MLOps on GCP.
Experience working in agile environments and client-facing roles.

What We Offer:
Opportunity to work on large-scale data modernization projects with global clients.
A fast-growing company with a strong tech and people culture.
Competitive salary, benefits, and flexibility.
A collaborative environment that values innovation and leadership.
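To make the ingestion responsibility concrete, here is a minimal sketch, assuming the Apache Beam Python SDK and hypothetical topic, table, and field names, of the Pub/Sub-to-BigQuery streaming pattern the role describes:

```python
# A minimal sketch (not production code) of a streaming ingestion pipeline:
# read JSON messages from Pub/Sub and append them to a BigQuery table.
# Topic, table, and field names are hypothetical placeholders.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(streaming=True)  # keep the Pub/Sub read unbounded

with beam.Pipeline(options=options) as pipeline:
    (
        pipeline
        | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
            topic="projects/example-project/topics/example-events")
        | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
        | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
            table="example-project:analytics.events",
            schema="event_id:STRING,payload:STRING,ts:TIMESTAMP",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
        )
    )
```

On GCP this pipeline would typically be submitted to the Dataflow runner; the same shape also covers batch ETL by swapping the Pub/Sub source for a bounded one.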
Posted 2 weeks ago
5.0 - 8.0 years
3 - 7 Lacs
Hyderabad
Work from Office
Role Purpose: The purpose of this role is to interpret data and turn it into information (reports, dashboards, interactive visualizations, etc.) that offers ways to improve a business, thereby informing business decisions.

Responsibilities:
1. Manage the technical scope of the project in line with the requirements at all stages:
a. Gather information from various sources (data warehouses, databases, data integration and modelling) and interpret patterns and trends.
b. Develop record management processes and policies.
c. Build and maintain relationships at all levels within the client base and understand their requirements.
d. Provide sales data, proposals, data insights, and account reviews to the client base.
e. Identify areas to increase efficiency and automation of processes.
f. Set up and maintain automated data processes.
g. Identify, evaluate, and implement external services and tools to support data validation and cleansing.
h. Produce and track key performance indicators.
2. Analyze data sets and provide adequate information:
a. Liaise with internal and external clients to fully understand data content.
b. Design and carry out surveys, and analyze survey data per customer requirements.
c. Analyze and interpret complex data sets relating to the customer's business and prepare reports for internal and external audiences using business analytics reporting tools.
d. Create data dashboards, graphs, and visualizations to showcase business performance and provide sector and competitor benchmarking.
e. Mine and analyze large datasets, draw valid inferences, and present them to management using a reporting tool (a minimal BigQuery sketch follows this listing).
f. Develop predictive models and share insights with clients per their requirements.

Mandatory Skills: Google BigQuery.
Experience: 5-8 Years.
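To illustrate item 2e, here is a minimal sketch, assuming the google-cloud-bigquery Python client and hypothetical dataset and column names, of aggregating a large table into a small report that could feed a dashboard:

```python
# A minimal sketch of a BigQuery aggregation behind a KPI report.
# Dataset, table, and column names are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client()

sql = """
    SELECT region,
           COUNT(DISTINCT customer_id) AS customers,
           ROUND(SUM(order_value), 2) AS revenue
    FROM `example-project.sales.orders`
    WHERE order_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 90 DAY)
    GROUP BY region
    ORDER BY revenue DESC
"""

for row in client.query(sql).result():
    # Rows expose columns as attributes, convenient when feeding a dashboard.
    print(f"{row.region}: {row.customers} customers, {row.revenue} revenue")
```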
Posted 2 weeks ago
1.0 - 4.0 years
2 - 6 Lacs
Gurugram
Work from Office
Desired technical and interpersonal skills include, but are not limited to:
1. BE with hands-on experience in Cisco technologies.
2. CCNA and/or CCNP Routing & Switching certifications (preferred).
3. Strong communication skills.
4. Very good understanding of Cisco architectures (EN/Sec/SP) and solutions.
5. Desire and ability to learn new technologies and solutions.

Specialized experience requirements:
6+ years of experience in any one EN/Sec/SP architecture.
Understanding of, and preferably hands-on experience with, the detailed sub-technologies in that architecture.
Ability to understand and capture technical as well as business requirements.
Self-starter with excellent presentation and consultative skills.
Strong analytical, communication (written and verbal), and business skills.
Posted 2 weeks ago
4.0 - 5.0 years
10 - 12 Lacs
Bengaluru
Work from Office
Position Overview: We are seeking an experienced Full Stack Developer to join our dynamic development team. The ideal candidate will have strong expertise in modern web technologies, cloud platforms, and data analytics tools to build scalable, high-performance applications.

Key Responsibilities:

Frontend Development:
Develop responsive and interactive user interfaces using React.js and modern JavaScript (ES6+).
Implement state management solutions using Redux, Context API, or similar frameworks.
Ensure cross-browser compatibility and optimize applications for performance.
Collaborate with UX/UI designers to translate mockups into functional components.
Write clean, maintainable, and well-documented frontend code.

Backend Development:
Design and develop RESTful APIs and microservices using Node.js and Express.js.
Implement authentication and authorization mechanisms (JWT, OAuth).
Build real-time applications using WebSocket or Socket.io.
Optimize server-side performance and handle database operations efficiently.
Develop and maintain middleware for logging, error handling, and security.

Database Management:
Design and optimize SQL database schemas (PostgreSQL, MySQL, or SQL Server).
Write complex queries, stored procedures, and database functions.
Implement data migration scripts and maintain database integrity.
Monitor database performance and implement optimization strategies.

Big Data & Analytics:
Develop data pipelines and ETL processes using Google BigQuery (a minimal load-job sketch follows this listing).
Create and optimize complex SQL queries for large datasets.
Implement data visualization solutions and reporting dashboards.
Work with streaming data and batch processing workflows.

Cloud Infrastructure:
Deploy and manage applications on Azure or Google Cloud Platform (GCP).
Implement CI/CD pipelines using cloud-native tools.
Configure and manage cloud databases, storage solutions, and networking.
Monitor application performance and implement auto-scaling solutions.
Ensure security best practices and compliance requirements.

Required Qualifications:

Technical Skills:
4-5 years of professional experience in full stack development.
Proficiency in Node.js, Express.js, and JavaScript/TypeScript.
Strong experience with React.js, HTML5, CSS3, and modern frontend frameworks.
Solid understanding of SQL databases and query optimization.
Hands-on experience with Google BigQuery for data analytics.
Experience with cloud platforms (Azure or GCP), including deployment and management.
Knowledge of version control systems (Git) and collaborative development workflows.

Additional Requirements:
Experience with containerization technologies (Docker, Kubernetes).
Understanding of microservices architecture and API design principles.
Familiarity with testing frameworks (Jest, Mocha, Cypress).
Knowledge of security best practices and data protection regulations.
Experience with monitoring and logging tools (Application Insights, Stackdriver).
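To make the BigQuery ETL bullet concrete, here is a minimal sketch, assuming the google-cloud-bigquery Python client and hypothetical bucket and table names, of a batch load from Cloud Storage into a BigQuery table:

```python
# A minimal sketch of the batch-load half of an ETL process: loading
# newline-delimited JSON from Cloud Storage into a BigQuery table.
# Bucket, project, and table names are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client()

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
    autodetect=True,  # infer the schema from the incoming data
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

load_job = client.load_table_from_uri(
    "gs://example-bucket/exports/events-*.json",  # hypothetical bucket
    "example-project.analytics.events",
    job_config=job_config,
)
load_job.result()  # wait for the load job to complete

table = client.get_table("example-project.analytics.events")
print(f"Loaded {table.num_rows} rows total")
```

In a full pipeline, a scheduler (e.g., a cron job or an orchestrator) would trigger a load like this after each upstream export lands in the bucket.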
Posted 2 weeks ago