1 - 4 years
3 - 6 Lacs
Varanasi
Work from Office
Job Title: Technical Writer
Location: Head Office, Varanasi, Uttar Pradesh
Department: Information & Communication Technology
Salary: Negotiable

About the role: As a technical writer, you will play a crucial role in our organization by translating complex technical information into clear, concise, and user-friendly documentation. Your primary responsibility will be to create, edit, and maintain various types of technical content, such as user manuals, DFDs, SOPs, SRS documents, meeting minutes, and other related technical documents. You will collaborate closely with subject matter experts, software developers, and product managers to collect information and ensure accuracy in your documentation.

Responsibilities of a Technical Writer:
- Develop comprehensive documentation that meets organizational standards.
- Obtain a deep understanding of products and services to translate complex product information into simple, polished, and engaging content.
- Write user-friendly content that meets the needs of the target audience, providing product insights in simple language that sets our users up for success.
- Research, outline, write, and edit new content, working closely with various departments to understand project requirements.
- Independently gather information from subject matter experts to develop, organize, and write procedure manuals, technical specifications, and process documentation.
- Research, create, and maintain information architecture templates that uphold organizational and legal standards and allow for easy data migration.
- Use photographs, drawings, diagrams, animation, and charts that increase users' understanding.
- Gather usability feedback from customers, designers, and manufacturers.
- Revise documents as new issues arise.

Skill sets and experience required for technical writers:
- Proven ability to handle multiple projects simultaneously, with an eye for prioritization.
- Firm understanding of the Software Development Life Cycle (SDLC).
- Well-versed in networking and servers.
- Experience using documentation tools (RoboHelp, MS Word).
- Proven ability to quickly learn and understand complex topics.
- Previous experience writing documentation and procedural materials for multiple audiences.
- Superior written and verbal communication skills.
- Experience working with engineering teams and helping refine content and create visuals and diagrams for technical support content.

Eligibility Criteria:
Education: BCA, MCA, or B.Tech (CS/IT) or a related field.
Experience: 2+ years of experience as a technical writer.
Skills: Strong knowledge of technical documentation, a firm understanding of the SDLC, and the ability to create diagrams for technical support content.
Communication: Excellent verbal and written communication skills to liaise with technical and non-technical stakeholders.

How to Apply:
Email Application: Send your CV to hr19@cashpor.in with the subject line "Applying for the position of Tech Writer".
CC the HR Team: Include hr20@cashpor.in and hr35@cashpor.in in the CC field.
LinkedIn Application: You can also apply via the "Technical Writer | Cashpor Micro Credit" posting on LinkedIn.
Await Response: The HR team will contact shortlisted candidates for further steps.
Mode of Interview: Technical Assessment, followed by a Personal Interview.

If you're an IT professional eager to work in a fast-paced information security environment, apply now and help us fortify our digital defenses!

Regards,
Devendra Pratap Singh
Sr. Manager - HRD
CASHPOR Micro Credit
Posted 1 month ago
4 - 9 years
10 - 14 Lacs
Pune
Hybrid
Job Description

Technical Skills (top skills for this position):
- Google Cloud Platform (Composer, BigQuery, Airflow, Dataproc, Dataflow, GCS)
- Data warehousing knowledge
- Hands-on experience with Python and SQL databases
- Analytical skills to predict the consequences of configuration changes (impact analysis), identify root causes that are not obvious, and understand the business requirements
- Excellent communication with different stakeholders (business, technical, project)
- Good understanding of the overall Big Data and Data Science ecosystem
- Experience with building and deploying containers as services using Swarm/Kubernetes
- Good understanding of container concepts such as building lean and secure images
- Understanding of modern DevOps pipelines
- Experience with streaming data pipelines using Kafka or Pub/Sub (mandatory for Kafka resources)

Good to have: Professional Data Engineer or Associate Data Engineer certification

Roles and Responsibilities:
- Design, build and manage big data ingestion and processing applications on Google Cloud using BigQuery, Dataflow, Composer, Cloud Storage and Dataproc.
- Performance tuning and analysis of Spark, Apache Beam (Dataflow) or similar distributed computing tools and applications on Google Cloud.
- Good understanding of Google Cloud concepts, environments and utilities to design cloud-optimal solutions for machine learning applications.
- Build systems to perform real-time data processing using Kafka, Pub/Sub, Spark Streaming or similar technologies.
- Manage the development life cycle for agile software development projects.
- Convert proofs of concept into industrialized Machine Learning models (MLOps).
- Provide solutions to complex problems and deliver customer-oriented solutions in a timely, collaborative manner.
- Proactive thinking, planning and understanding of dependencies.
- Develop and implement robust solutions in test and production environments.
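For readers less familiar with the stack named above, here is a minimal, illustrative sketch of a Dataflow batch job written with the Apache Beam Python SDK: it reads CSV files from GCS and appends rows to a BigQuery table. The project, bucket, dataset and schema names are hypothetical placeholders, not details from this posting.

```python
# Minimal sketch: a batch Apache Beam pipeline that reads CSV lines from GCS,
# parses them, and appends rows to a BigQuery table. All names are hypothetical.
import csv

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_line(line: str):
    # Turn one CSV line into a dict matching the BigQuery schema below.
    order_id, amount, country = next(csv.reader([line]))
    return {"order_id": order_id, "amount": float(amount), "country": country}


def run():
    options = PipelineOptions(
        runner="DataflowRunner",          # use "DirectRunner" for local testing
        project="my-gcp-project",         # hypothetical project id
        region="europe-west1",
        temp_location="gs://my-temp-bucket/tmp",
    )
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadFromGCS" >> beam.io.ReadFromText(
                "gs://my-input-bucket/orders/*.csv", skip_header_lines=1
            )
            | "Parse" >> beam.Map(parse_line)
            | "WriteToBQ" >> beam.io.WriteToBigQuery(
                "my-gcp-project:analytics.orders",
                schema="order_id:STRING,amount:FLOAT,country:STRING",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )


if __name__ == "__main__":
    run()
```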
Posted 1 month ago
4 - 7 years
6 - 16 Lacs
Bengaluru
Work from Office
Senior Software Engineer Google Cloud Platform (GCP)
Location: Bangalore, India

Why Join Fossil Group?
At Fossil Group, we are part of an international team that dares to dream, disrupt, and deliver innovative watches, jewelry, and leather goods to the world. We're committed to long-term value creation, driven by technology and our core values: Authenticity, Grit, Curiosity, Humor, and Impact. If you are a forward-thinker who thrives in a diverse, global setting, we want to hear from you.

Make an Impact (Job Summary + Responsibilities)
We are looking for a Senior Software Engineer – GCP to join our growing Cloud & Data Engineering team at Fossil Group. This role involves building scalable cloud-native data pipelines using Google Cloud Platform services, with a focus on Dataflow, Dataproc and BigQuery, and strong development skills in Java, Python, and SQL. You will collaborate with global architects and business teams to design and deploy innovative solutions, supporting data analytics, automation, and transformation needs.

What you will do in this role:
- Design, develop, deploy, and maintain data pipelines and services using GCP technologies including Dataflow, Dataproc, BigQuery, Composer, and others.
- Translate blueprinting documents and business requirements into scalable and maintainable GCP configurations and solutions.
- Develop and enhance cloud-based batch/streaming jobs using Java or Python.
- Collaborate with global architects and cross-functional teams to define solutions and execute projects across development and testing environments.
- Perform unit testing and integration testing, and resolve issues arising during the QA lifecycle.
- Work closely with internal stakeholders to gather requirements, present technical solutions, and provide end-to-end delivery.
- Own and manage project timelines, priorities, and documentation.
- Continuously improve processes and stay current with GCP advancements and big data technologies.

Who You Are (Requirements)
- Bachelor's degree in Computer Science or a related field.
- 4-7 years of experience as a DB/SQL Developer or Java/Python Developer with strong SQL capabilities.
- Hands-on experience with GCP components such as Dataflow, Dataproc, BigQuery, Cloud Functions, Composer, and Data Fusion.
- Excellent command of SQL with the ability to write complex queries and perform advanced data transformation.
- Strong programming skills in Java and/or Python, specifically for building cloud-native data pipelines.
- Experience with relational and NoSQL databases.
- Ability to understand and translate business requirements into functional specifications.
- Familiarity with BI dashboards and Google Data Studio is a plus.
- Strong problem-solving, communication, and collaboration skills.
- Self-directed, with a growth mindset and eagerness to upskill in emerging GCP technologies.
- Comfortable leading meetings, gathering requirements, and managing stakeholder communication across regions.

What We Offer
- Comprehensive Benefits: Includes health and well-being services.
- Paid Parental Leave & Return to Work Program: Support for new parents and caregivers with paid leave and a flexible phase-back schedule.
- Generous Paid Time Off: Includes Sick Time, Personal Days, and Summer Flex Fridays.
- Employee Discounts: Save on Fossil merchandise.

EEO Statement
At Fossil, we believe our differences not only make us stronger as a team, but also help us create better products and a richer community.
We are an Equal Employment Opportunity Employer dedicated to a policy of non-discrimination in all employment practices without regard to age, disability, gender identity or expression, marital status, pregnancy, race, religion, sexual orientation, or any other protected characteristic.
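As a small illustration of the BigQuery development work this role describes, the sketch below runs a parameterised query with the google-cloud-bigquery Python client. The project, dataset, table and column names are hypothetical, and Application Default Credentials are assumed.

```python
# Minimal sketch: running a parameterised BigQuery query from Python, the kind
# of building block used inside a cloud-native data pipeline. All names are
# hypothetical; requires the google-cloud-bigquery package and ADC credentials.
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")  # hypothetical project id

sql = """
    SELECT country, COUNT(*) AS orders, SUM(amount) AS revenue
    FROM `my-gcp-project.analytics.orders`
    WHERE order_date >= @start_date
    GROUP BY country
    ORDER BY revenue DESC
"""

job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter("start_date", "DATE", "2024-01-01"),
    ]
)

# The query job runs server-side; result() streams rows back once it finishes.
for row in client.query(sql, job_config=job_config).result():
    print(f"{row.country}: {row.orders} orders, {row.revenue:.2f} revenue")
```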
Posted 1 month ago
10 - 20 years
25 - 40 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
Hi, hope you are looking for a job change. We have an opening for a GCP Data Architect with an MNC, for pan-India locations. I'm sharing the JD with you; please have a look and revert with the details below along with your updated resume. Apply only if you can join within 10 days. It is a 5-day work-from-office role. We do not process candidates with long notice periods.

Role: GCP Data Architect
Experience: 10+ years
Mode: Permanent
Work Location: Pan India
Notice Period: Immediate to 10 days
Mandatory Skills: GCP, architecture experience, Big Data, data modelling, BigQuery

Details to share along with your resume: Full Name (as per Aadhaar card), Email ID, Mobile Number, Alternate Number, Qualification, Graduation Year, Regular Course, Total Experience, Relevant Experience, Current Organization, Working as Permanent Employee, Payroll Company, Experience in GCP, Experience in Architecture, Experience in GCP Data Architecture, Experience in Big Data, Experience in BigQuery, Experience in Data Management, Official Notice Period, Serving Notice Period, Current Location, Preferred Location, Current CTC, Expected CTC, CTC Breakup, PAN Card Number, Date of Birth, Any Offer in Hand, Offered CTC, LWD, Can You Join Immediately, Ready to Work from Office for 5 Days.

Job Description: GCP Data Architect
We are seeking a skilled data solution architect to design solutions and lead the implementation on GCP. The ideal candidate will possess extensive experience in data architecting, solution design and data management practices.

Responsibilities:
- Architect and design end-to-end data solutions on the cloud platform, focusing on data warehousing and big data platforms.
- Collaborate with clients, developers, and architecture teams to understand requirements and translate them into effective data solutions.
- Develop high-level and detailed data architecture and design documentation.
- Implement data management and data governance strategies, ensuring compliance with industry standards.
- Architect both batch and real-time data solutions, leveraging cloud-native services and technologies.
- Design and manage data pipeline processes for historic data migration and data integration.
- Collaborate with business analysts to understand domain data requirements and incorporate them into the design deliverables.
- Drive innovation in data analytics by leveraging cutting-edge technologies and methodologies.
- Demonstrate excellent verbal and written communication skills to communicate complex ideas and concepts effectively.
- Stay updated on the latest advancements in data analytics, data architecture, and data management techniques.

Requirements:
- Minimum of 5 years of experience in a Data Architect role, supporting warehouse and cloud data platforms/environments.
- Experience with common GCP services such as BigQuery, Dataflow, GCS, service accounts and Cloud Functions.
- Extremely strong in BigQuery design and development.
- Extensive knowledge and implementation experience in data management, governance, and security frameworks.
- Proven experience in creating high-level and detailed data architecture and design documentation.
- Strong aptitude for business analysis to understand domain data requirements.
- Proficiency in data modelling using any modelling tool for conceptual, logical, and physical models is preferred.
- Hands-on experience architecting end-to-end data solutions for both batch and real-time designs.
- Ability to collaborate effectively with clients, developers, and architecture teams to implement enterprise-level data solutions.
- Familiarity with Data Fabric and Data Mesh architecture is a plus.
- Excellent verbal and written communication skills.

Regards,
Rejeesh S
Email: rejeesh.s@jobworld.jobs
Mobile: +91-9188336668
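To illustrate one of the BigQuery design decisions this role calls for, here is a minimal sketch that creates a date-partitioned, clustered table with the Python client; choices like these drive scan cost and partition pruning. The project, dataset and column names are hypothetical, not details from the posting.

```python
# Minimal sketch of one BigQuery design decision an architect typically makes:
# creating a date-partitioned, clustered table via the Python client.
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")  # hypothetical project id

table = bigquery.Table(
    "my-gcp-project.analytics.events",   # hypothetical dataset.table
    schema=[
        bigquery.SchemaField("event_id", "STRING", mode="REQUIRED"),
        bigquery.SchemaField("event_ts", "TIMESTAMP", mode="REQUIRED"),
        bigquery.SchemaField("customer_id", "STRING"),
        bigquery.SchemaField("payload", "STRING"),
    ],
)
# Partition by day on the event timestamp so queries scan only the days they need...
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY, field="event_ts"
)
# ...and cluster by customer_id so per-customer queries prune storage blocks.
table.clustering_fields = ["customer_id"]

client.create_table(table, exists_ok=True)
```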
Posted 1 month ago
1 - 5 years
12 - 17 Lacs
Hyderabad
Work from Office
Job Area: Information Technology Group, Information Technology Group > IT Data Engineer

General Summary:
The developer will play an integral role in the PTEIT Machine Learning Data Engineering team, designing, developing and supporting data pipelines in a hybrid cloud environment to enable advanced analytics, along with the CI/CD of those pipelines and services.
- 5+ years of experience with Python or an equivalent programming language, using OOPS, data structures and algorithms
- Develop new services in AWS using serverless and container-based services
- 3+ years of hands-on experience with the AWS suite of services (EC2, IAM, S3, CDK, Glue, Athena, Lambda, Redshift, Snowflake, RDS)
- 3+ years of expertise in scheduling data flows using Apache Airflow
- 3+ years of strong data modelling (functional, logical and physical) and data architecture experience in a Data Lake and/or Data Warehouse
- 3+ years of experience with SQL databases
- 3+ years of experience with CI/CD and DevOps using Jenkins
- 3+ years of experience with event-driven architecture, especially Change Data Capture
- 3+ years of experience in Apache Spark, SQL, Redshift (or BigQuery or Snowflake), Databricks
- Deep understanding of building efficient data pipelines with data observability, data quality, schema drift handling, alerting and monitoring
- Good understanding of data catalogs, data governance, compliance, security and data sharing
- Experience in building reusable services across data processing systems
- Ability to work and contribute beyond defined responsibilities
- Excellent communication and interpersonal skills with deep problem-solving skills

Minimum Qualifications:
3+ years of IT-related work experience with a Bachelor's degree in Computer Engineering, Computer Science, Information Systems or a related field, OR 5+ years of IT-related work experience without a Bachelor's degree.
2+ years of any combination of academic or work experience with programming (e.g., Java, Python).
1+ year of any combination of academic or work experience with SQL or NoSQL databases.
1+ year of any combination of academic or work experience with data structures and algorithms.

5 years of industry experience, with a minimum of 3 years of Data Engineering development experience in highly reputed organizations.
- Proficiency in Python and AWS
- Excellent problem-solving skills
- Deep understanding of data structures and algorithms
- Proven experience in building cloud-native software, preferably with the AWS suite of services
- Proven experience in designing and developing data models using RDBMS (Oracle, MySQL, etc.)

Desirable:
- Exposure or experience in other cloud platforms (Azure and GCP)
- Experience working on the internals of large-scale distributed systems and databases such as Hadoop and Spark
- Working experience on data lakehouse platforms (OneHouse, Databricks Lakehouse)
- Working experience with data lakehouse file formats (Delta Lake, Iceberg, Hudi)

Bachelor's or Master's degree in Computer Science, Software Engineering, or a related field.
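As an illustration of the Airflow scheduling experience the posting asks for, below is a minimal Airflow 2.x DAG sketch with two stubbed tasks chained into a daily run. The DAG name, schedule and task logic are hypothetical examples, not part of the posting.

```python
# Minimal sketch of an Airflow 2.x DAG used to schedule a daily data flow;
# the extract/load callables are stubs and all names are hypothetical.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Pull the previous day's records from the source system (stub).
    print("extracting for", context["ds"])


def load(**context):
    # Load the transformed records into the warehouse (stub).
    print("loading for", context["ds"])


with DAG(
    dag_id="daily_orders_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="0 2 * * *",   # run at 02:00 every day
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=10)},
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> load_task   # load only runs after extract succeeds
```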
Posted 1 month ago
11 - 18 years
40 - 75 Lacs
Gurugram
Work from Office
We're looking for a Big Data and ML Solutions Architect who expects more from their career, with 11+ years of experience building large-scale products and distributed systems in a high-calibre environment.

Architecture: Identify and solve major architectural problems by going deep in your field or broad across different teams. Extend, improve, or, when needed, build solutions to address architectural gaps or technical debt.

Software Engineering/Programming: Create frameworks and abstractions that are reliable and reusable. Sophisticated knowledge of at least one programming language (Java, Python, Scala), and happy to learn more.

Role & responsibilities
- Proven success architecting and scaling complex software solutions; familiarity with interface design.
- Experience and ability to drive a project/module independently from an execution standpoint.
- Prior experience with scalable architecture and distributed processing.
- Strong programming expertise in Python, SQL, Scala.
- Hands-on experience with any major big data solutions like Spark, Kafka, Hive.
- Strong data management skills with ETL, DWH, Data Quality and Data Governance.
- Hands-on experience with microservices architecture, Docker and Kubernetes as orchestration.
- Experience with cloud-based data stores like Redshift and BigQuery.
- Experience in cloud solution architecture.
- Experience architecting Spark jobs running on Kubernetes and optimizing Spark jobs.
- Experience with MLOps architecture/tools/orchestrators like Kubeflow, MLflow.
- Experience with logging, metrics and distributed tracing systems (e.g. Prometheus/Grafana/Kibana).
- Experience in CI/CD using Octopus/TeamCity/Jenkins.

Preferred candidates: only female candidates.

Interested candidates can share their updated resumes at surinder.kaur@mounttalent.com
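For context on the Spark work referenced above, here is a minimal PySpark sketch of a batch aggregation job: read Parquet, filter, aggregate and write date-partitioned output. The paths and column names are hypothetical placeholders.

```python
# Minimal sketch of the kind of batch Spark job an architect in this role
# reviews and optimises. All paths and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("daily_revenue_rollup")
    .getOrCreate()
)

orders = spark.read.parquet("s3a://my-lake/raw/orders/")  # hypothetical path

daily_revenue = (
    orders
    .filter(F.col("status") == "COMPLETED")
    .groupBy("order_date", "country")
    .agg(
        F.count("*").alias("order_count"),
        F.sum("amount").alias("revenue"),
    )
)

# Partition output by date so downstream jobs can prune days they do not need.
(
    daily_revenue
    .repartition("order_date")
    .write.mode("overwrite")
    .partitionBy("order_date")
    .parquet("s3a://my-lake/curated/daily_revenue/")
)

spark.stop()
```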
Posted 1 month ago
11 - 20 years
45 - 75 Lacs
Gurugram
Work from Office
Role & responsibilities
- Proven success architecting and scaling complex software solutions; familiarity with interface design.
- Experience and ability to drive a project/module independently from an execution standpoint.
- Prior experience with scalable architecture and distributed processing.
- Strong programming expertise in Python, SQL, Scala.
- Hands-on experience with any major big data solutions like Spark, Kafka, Hive.
- Strong data management skills with ETL, DWH, Data Quality and Data Governance.
- Hands-on experience with microservices architecture, Docker and Kubernetes as orchestration.
- Experience with cloud-based data stores like Redshift and BigQuery.
- Experience in cloud solution architecture.
- Experience architecting Spark jobs running on Kubernetes and optimizing Spark jobs.
- Experience with MLOps architecture/tools/orchestrators like Kubeflow, MLflow.
- Experience with logging, metrics and distributed tracing systems (e.g. Prometheus/Grafana/Kibana).
- Experience in CI/CD using Octopus/TeamCity/Jenkins.

Interested candidates can share their updated resume at surinder.kaur@mounttalent.com
Posted 1 month ago
6 - 10 years
20 - 25 Lacs
Hyderabad
Work from Office
Roles and Responsibilities:
- Drive end-to-end program management of initiatives, from requirements to delivery, for a wide range of customers.
- Responsible for preparing user stories, data flows, Business Requirement Documents (BRD), Functional Requirement Specifications (FRS) and use cases for every initiative.
- Construct workflow charts and diagrams in collaboration with customers and ZINFI Product.
- Create detailed plans for the execution and implementation of new processes, taking the customer into confidence.
- Monitor project progress and perform daily, weekly and monthly reviews and analyses of current processes using operational metrics and reports as mandated by the customer.
- Communicate with team heads regarding common challenges, roadblocks and other issues that interrupt their workflow.
- Produce detailed costings for customers and ensure the contract is profitable.
- Ensure that the company's product and features can deliver on the customer's requirements.

Requirements:
- Ability to impact operations and effect change without being confrontational.
- Detail oriented, analytical, and inquisitive.
- Ability to work independently and with others.
- Extremely organized with strong time-management skills.
- Excellent communication skills and ability to explain complex issues.
- Excellent project management skills (Aha!, MS Project, JIRA, ADO).
- Customer Awareness: Ability to understand the customer, their needs, their workflows, their business, potential impact opportunities and their KPIs.
- Business Awareness: Ability to understand
Posted 1 month ago
5 - 10 years
20 - 25 Lacs
Bengaluru
Work from Office
About The Role
Job Title: Transformation Principal Change Analyst
Corporate Title: AVP
Location: Bangalore, India

Role Description
We are looking for an experienced Change Manager to lead a variety of regional/global change initiatives. Utilizing the tenets of PMI, you will lead cross-functional initiatives that transform the way we run our operations. If you like to solve complex problems, have a "gets things done" attitude and are looking for a highly visible, dynamic role where your voice is heard and your experience is appreciated, come talk to us.

What we'll offer you
As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
- Best in class leave policy
- Gender neutral parental leaves
- 100% reimbursement under childcare assistance benefit (gender neutral)
- Sponsorship for industry relevant certifications and education
- Employee Assistance Program for you and your family members
- Comprehensive hospitalization insurance for you and your dependents
- Accident and term life insurance
- Complementary health screening for 35 yrs. and above

Your key responsibilities
- Responsible for change management planning, execution and reporting, adhering to governance standards and ensuring transparency around progress status; using data to tell the story, maintain risk management controls and monitor and communicate initiative risks; collaborate with other departments as required to execute on timelines to meet the strategic goals.
- As part of the larger team, accountable for the delivery and adoption of the global change portfolio, including but not limited to business case development/analysis, reporting, measurement and reporting of adoption success measures, and continuous improvement.
- As required, using data to tell the story, participate in Working Group and Steering Committee meetings to achieve the right level of decision making and progress/transparency, establishing strong partnerships and collaborative relationships with various stakeholder groups to remove constraints to success and carry learnings forward to future projects.
- As required, develop and document end-to-end roles and responsibilities, including process flows, operating procedures and required controls, gathering and documenting business requirements (user stories), including liaising with end users and performing analysis of gathered data.
- Heavily involved in the product development journey.

Your skills and experience
- Overall experience of at least 7-10 years leading complex change programs/projects, communicating and driving transformation initiatives using the tenets of PMI in a highly matrixed environment.
- Banking/finance/regulated industry experience, of which at least 2 years in the change/transformation space or associated with change/transformation initiatives, is a plus.
- Knowledge of client lifecycle processes and procedures, and experience with KYC data structures/data flows, is preferred.
- Experience working with management reporting is preferred.
- Bachelor's degree.

How we'll support you
- Training and development to help you excel in your career.
- Coaching and support from experts in your team.
- A culture of continuous learning to aid progression.
- A range of flexible benefits that you can tailor to suit your needs.

About us and our teams
Please visit our company website for further information: https://www.db.com/company/company.htm
We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively.
Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
Posted 1 month ago
2 - 6 years
8 - 12 Lacs
Pune
Work from Office
About The Role
Job Title: Senior Engineer - Data Engineer (ETL, Big Data, Hadoop, Spark, GCP), AVP
Location: Pune, India

Role Description
The senior engineer is responsible for developing and delivering elements of engineering solutions to accomplish business goals. Awareness is expected of the important engineering principles of the bank. Root cause analysis skills develop through addressing enhancements and fixes to products; reliability and resiliency are built into solutions through early testing, peer reviews and automation of the delivery life cycle. The successful candidate should be able to work independently on medium to large sized projects with strict deadlines, in a cross-application, mixed technical environment, and must demonstrate a solid hands-on development track record while working in an agile methodology. The role demands working alongside a geographically dispersed team. The position is required as part of the buildout of the Compliance Tech internal development team in India. The overall team will primarily deliver improvements in Compliance Tech capabilities that are major components of the regular regulatory portfolio, addressing various regulatory commitments and mandated monitors.

What we'll offer you
As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
- Best in class leave policy
- Gender neutral parental leaves
- 100% reimbursement under childcare assistance benefit (gender neutral)
- Sponsorship for industry relevant certifications and education
- Employee Assistance Program for you and your family members
- Comprehensive hospitalization insurance for you and your dependents
- Accident and term life insurance
- Complementary health screening for 35 yrs. and above

Your key responsibilities
- Analyze data sets; design and code stable and scalable data ingestion workflows, integrating them into existing workflows.
- Work with team members and stakeholders to clarify requirements and provide the appropriate ETL solution.
- Work as a senior developer building analytics algorithms on top of ingested data, and for various data sourcing in Hadoop and GCP.
- Own unit testing, UAT deployment, end-user sign-off and production go-live.
- Ensure new code is tested at both unit and system level; design, develop and peer review new code and functionality.
- Operate as a member of an agile scrum team.
- Apply root cause analysis skills to identify bugs and issues behind failures.
- Support the production support and release management teams in their tasks.

Your skills and experience
- More than 10 years of coding experience with reputed organizations.
- Hands-on experience with Bitbucket and CI/CD pipelines.
- Proficient in Hadoop, Python, Spark, SQL, Unix and Hive.
- Basic understanding of on-prem and GCP data security.
- Hands-on development experience on large ETL/big data systems; GCP is a big plus.
- Hands-on experience with Cloud Build, Artifact Registry, Cloud DNS, Cloud Load Balancing, etc.
- Hands-on experience with Dataflow, Cloud Composer, Cloud Storage, Dataproc, etc.
- Basic understanding of data quality dimensions like consistency, completeness, accuracy, lineage, etc.
- Hands-on business and systems knowledge gained in a regulatory delivery environment.

Desired
- Banking experience, regulatory and cross-product knowledge.
- Passionate about test-driven development.
- Prior experience with release management tasks and responsibilities.
- Data visualization experience is good to have.
How we'll support you
- Training and development to help you excel in your career.
- Coaching and support from experts in your team.
- A culture of continuous learning to aid progression.
- A range of flexible benefits that you can tailor to suit your needs.
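As a small illustration of the data quality dimensions mentioned in this posting (completeness in particular), the sketch below counts nulls in critical columns after an ingestion run and fails the job if a threshold is breached. The bucket path, column names and threshold are hypothetical.

```python
# Minimal sketch of a completeness check run after an ingestion job: count
# nulls per critical column and raise if the null rate breaches a threshold,
# so an orchestrator such as Composer can halt the run. Names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("ingestion_quality_check").getOrCreate()

df = spark.read.parquet("gs://my-landing-bucket/trades/2024-06-01/")  # hypothetical

critical_columns = ["trade_id", "counterparty", "notional"]
total = df.count()

# One aggregation pass: sum of isNull() casts gives the null count per column.
null_counts = df.select(
    [F.sum(F.col(c).isNull().cast("int")).alias(c) for c in critical_columns]
).collect()[0].asDict()

failures = {
    c: n for c, n in null_counts.items()
    if n and n / max(total, 1) > 0.01   # more than 1% nulls counts as a failure
}

if failures:
    raise ValueError(f"Completeness check failed (>1% nulls): {failures}")

print(f"Completeness check passed for {total} rows.")
spark.stop()
```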
Posted 1 month ago
3 - 7 years
10 - 14 Lacs
Pune
Work from Office
About The Role
Job Title: GCP Data Engineer, AS
Location: Pune, India

Role Description
An Engineer is responsible for designing and developing entire engineering solutions to accomplish business goals. Key responsibilities of this role include ensuring that solutions are well architected, with maintainability and ease of testing built in from the outset, and that they can be integrated successfully into the end-to-end business process flow. They will have gained significant experience through multiple implementations and have begun to develop both depth and breadth in several engineering competencies. They have extensive knowledge of design and architectural patterns. They will provide engineering thought leadership within their teams and will play a role in mentoring and coaching less experienced engineers.

What we'll offer you
As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
- Best in class leave policy
- Gender neutral parental leaves
- 100% reimbursement under childcare assistance benefit (gender neutral)
- Sponsorship for industry relevant certifications and education
- Employee Assistance Program for you and your family members
- Comprehensive hospitalization insurance for you and your dependents
- Accident and term life insurance
- Complementary health screening for 35 yrs. and above

Your key responsibilities
- Design, develop and maintain data pipelines using Python and SQL on GCP.
- Apply Agile methodologies and ETL, ELT, data movement and data processing skills.
- Work with Cloud Composer to manage and process batch data jobs efficiently.
- Develop and optimize complex SQL queries for data analysis, extraction, and transformation.
- Develop and deploy Google Cloud services using Terraform.
- Implement CI/CD pipelines using GitHub Actions.
- Consume and host REST APIs using Python.
- Monitor and troubleshoot data pipelines, resolving any issues in a timely manner.
- Ensure team collaboration using Jira, Confluence, and other tools.
- Quickly learn new and existing technologies; strong problem-solving skills.
- Write advanced SQL and Python scripts.
- Certification as a Professional Google Cloud Data Engineer will be an added advantage.

Your skills and experience
- 6+ years of IT experience as a hands-on technologist.
- Proficient in Python for data engineering, and proficient in SQL.
- Hands-on experience with GCP Cloud Composer, Dataflow, BigQuery, Cloud Functions and Cloud Run; GKE is good to have.
- Hands-on experience in hosting and consuming REST APIs.
- Proficient in Terraform (HashiCorp).
- Experienced with GitHub, GitHub Actions and CI/CD.
- Experience in automating ETL testing using Python and SQL.
- APIGEE and Bitbucket are good to have.

How we'll support you
- Training and development to help you excel in your career.
- Coaching and support from experts in your team.
- A culture of continuous learning to aid progression.
- A range of flexible benefits that you can tailor to suit your needs.
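To illustrate the "host and consume REST APIs using Python" responsibility above, here is a minimal Flask 2.x sketch of a small service of the kind that could be containerised for Cloud Run. The endpoints, port and in-memory store are hypothetical stand-ins for a real backend such as BigQuery or Cloud SQL.

```python
# Minimal sketch: hosting a small REST API with Flask 2.x. The routes and the
# in-memory store are hypothetical placeholders for a real backend.
from flask import Flask, jsonify, request

app = Flask(__name__)

# In-memory store standing in for a real backend (e.g. BigQuery or Cloud SQL).
_PIPELINE_RUNS = {"daily_orders": {"status": "SUCCEEDED", "rows": 120000}}


@app.get("/runs/<pipeline>")
def get_run(pipeline: str):
    # Return the latest recorded run for a pipeline, or 404 if unknown.
    run = _PIPELINE_RUNS.get(pipeline)
    if run is None:
        return jsonify({"error": "unknown pipeline"}), 404
    return jsonify(run)


@app.post("/runs/<pipeline>")
def record_run(pipeline: str):
    # Record a run result posted as JSON by a pipeline job.
    _PIPELINE_RUNS[pipeline] = request.get_json(force=True)
    return jsonify({"ok": True}), 201


if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```

On the consuming side, a client script would call it with, for example, requests.get("http://localhost:8080/runs/daily_orders").json().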
Posted 1 month ago
4 - 9 years
14 - 19 Lacs
Pune
Work from Office
About The Role
We are looking for a passionate and self-motivated Technology Leader to join our team in the Accounting domain. As part of a diverse, multi-disciplinary global team, you will collaborate with other disciplines to shape technology strategy, drive engineering excellence and deliver business outcomes.

What we'll offer you
As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
* Best in class leave policy
* Gender neutral parental leaves
* 100% reimbursement under childcare assistance benefit (gender neutral)
* Sponsorship for industry relevant certifications and education
* Employee Assistance Program for you and your family members
* Comprehensive hospitalization insurance for you and your dependents
* Accident and term life insurance
* Complementary health screening for 35 yrs. and above

This role is responsible for the design and implementation of high-quality technology solutions. The candidate should have demonstrated technical expertise and excellent problem-solving skills. The candidate is expected to:
* be a hands-on engineering lead involved in analysis, design, design/code reviews, coding and release activities
* champion engineering best practices and guide/mentor the team to achieve high performance
* work closely with business stakeholders, the Tribe Lead, the Product Owner and the Lead Architect to successfully deliver the business outcomes
* acquire functional knowledge of the business capability being digitized/re-engineered
* demonstrate ownership, inspire others, think innovatively, show a growth mindset and collaborate for success
* focus on upskilling people, team building and career development
* keep up to date with industry trends and developments

Your Skills & Experience:
* Minimum 15 years of IT industry experience in full stack development
* Expert in Java, Spring Boot, NodeJS, SQL/PLSQL, ReactJS
* Strong experience in big data processing: Apache Spark, Hadoop, BigQuery, Dataproc, Dataflow, etc.
* Strong experience in Kubernetes and the OpenShift container platform
* Experience with databases: Oracle, PostgreSQL, MongoDB, Redis/Hazelcast; should understand data modeling, normalization, and performance optimization
* Experience with message queues (RabbitMQ/IBM MQ, JMS) and data streaming, i.e. Kafka, Pub/Sub, etc.
* Experience working on public cloud, GCP preferred, AWS or Azure
* Knowledge of various distributed/multi-tiered architecture styles: microservices, data mesh, integration patterns, etc.
* Experience with modern software product delivery practices, processes and tooling, and BizDevOps skills such as CI/CD pipelines using Jenkins, Git Actions, etc.
* Experience designing solutions based on DDD and implementing Clean/Hexagonal Architecture for efficient systems that can handle large-scale operations
* Experience leading teams and mentoring developers
* Focus on quality: experience with TDD, BDD, stress and contract tests
* Proficient in working with APIs (Application Programming Interfaces) and understanding data formats like JSON, XML, YAML, Parquet, etc.

Advantageous:
* Having prior experience in the Banking/Finance domain
* Having worked on hybrid cloud solutions, preferably using GCP
* Having worked on product development

How we'll support you:
* Training and development to help you excel in your career
* Coaching and support from experts in your team
* A culture of continuous learning to aid progression
* A range of flexible benefits that you can tailor to suit your needs

About us and our teams
Please visit our company website for further information: https://www.db.com/company/company.htm
We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
Posted 1 month ago
4 - 9 years
0 - 3 Lacs
Pune, Chennai, Mumbai (All Areas)
Work from Office
Role & responsibilities
• Coordinate with Business and different teams during different phases of the project
• Implement compensation rules in SAP Commissions using different components
• Develop a profound knowledge of the structure and application of commission plans and compensation strategy
• Develop an understanding of data sources, data flows, payload models and transformation logic
• Provide support once the project goes live and perform enhancements to the product

Technical requirements
• Technical experience in application development, with a focus on compensation systems
• Good understanding of the SAP portfolio and SAP Commissions components
• Good understanding of reporting, the dispute process, and data integration with the SAP Commissions product suite
• Experience integrating Commissions with existing enterprise solutions, on-premises and cloud
• Knowledge of compensation plans and bonuses
• Experience working in an Agile environment
• Must have great communication skills, both written and verbal
• Confident and outgoing in approach, good team player / team lead
• Should be comfortable working with the business team and setting the right expectations with them
Posted 1 month ago
6 - 11 years
0 - 1 Lacs
Pune, Chennai, Bengaluru
Work from Office
Role: GCP Data Engineer
Experience: 6-12 yrs
Location: Chennai, Hyderabad, Bangalore, Pune, Gurgaon

Required Skillset
=> Should have experience in BigQuery, Dataflow, Cloud SQL and Cloud Composer
=> Should have experience in Python, Vertex AI and Dataflow

Interested candidates can send their resume to jegadheeswari.m@spstaffing.in or reach me @ 9566720836
Posted 1 month ago
5 - 10 years
9 - 13 Lacs
Chennai
Work from Office
Overview
All GCP services (Pub/Sub, BigQuery, Airflow, Dataproc, Cloud Composer, GCS) + Teradata

Responsibilities
All GCP services (Pub/Sub, BigQuery, Airflow, Dataproc, Cloud Composer, GCS) + Teradata

Requirements
All GCP services (Pub/Sub, BigQuery, Airflow, Dataproc, Cloud Composer, GCS) + Teradata
Posted 1 month ago
3 - 7 years
5 - 9 Lacs
Mumbai
Work from Office
About The Role
The candidate must possess knowledge relevant to the functional area, act as a subject matter expert in providing advice in the area of expertise, and focus on continuous improvement for maximum efficiency. It is vital to focus on a high standard of delivery excellence, provide top-notch service quality and develop successful long-term business partnerships with internal/external customers by identifying and fulfilling customer needs. He/she should be able to break down complex problems into logical and manageable parts in a systematic way, generate and compare multiple options, and set priorities to resolve problems. The ideal candidate must be proactive and go beyond expectations to achieve job results and create new opportunities. He/she must positively influence the team, motivate high performance, promote a friendly climate, give constructive feedback, provide development opportunities, and manage the career aspirations of direct reports. Communication skills are key here, to explain organizational objectives, assignments, and the big picture to the team, and to articulate the team vision and clear objectives.

Senior Process Manager roles and responsibilities:
- Collaborate with stakeholders to gather and analyze business requirements.
- Utilize data skills to extract, transform, and analyze data from various sources.
- Interpret data to identify trends, patterns, and insights.
- Generate comprehensive reports to present findings to stakeholders.
- Document business processes, data flows, and requirements.
- Assist in the development and implementation of data-driven solutions.
- Conduct ad-hoc analysis as required to support business initiatives.

Technical and Functional Skills:
- Bachelor's degree with 5+ years of experience, including 3+ years of hands-on experience as a Business Analyst or in a similar role.
- Strong data skills with the ability to manipulate and analyze complex datasets.
- Proficiency in interpreting data and translating findings into actionable insights.
- Experience with report generation and data visualization tools.
- Solid understanding of business processes and data flows.
- Excellent communication and presentation skills.
- Ability to work independently and collaboratively in a team environment.
- Basic understanding of Google Cloud Platform (GCP), Tableau, SQL, and Python is a plus.
- Certification in Business Analysis or a related field.
- Familiarity with Google Cloud Platform (GCP) services and tools.
- Experience with Tableau for data visualization.
- Proficiency in SQL for data querying and manipulation.
- Basic knowledge of Python for data analysis and automation.
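As a small illustration of the extract-transform-analyse loop this role describes, the sketch below combines pandas with an in-memory SQLite database so it runs self-contained; the sample data and column names are hypothetical.

```python
# Minimal sketch of extract-transform-analyse with pandas and SQL, using an
# in-memory SQLite database so the example is self-contained. Data is hypothetical.
import sqlite3

import pandas as pd

# "Extract": a small inline sample standing in for a real data source.
raw = pd.DataFrame(
    {
        "region": ["North", "North", "South", "South", "West"],
        "month": ["2024-01", "2024-02", "2024-01", "2024-02", "2024-01"],
        "revenue": [120.0, 135.5, 98.0, 91.2, 143.7],
    }
)

# "Transform/Analyse" with SQL: push the frame to SQLite and query it back.
conn = sqlite3.connect(":memory:")
raw.to_sql("sales", conn, index=False)
trend = pd.read_sql(
    "SELECT region, AVG(revenue) AS avg_revenue "
    "FROM sales GROUP BY region ORDER BY avg_revenue DESC",
    conn,
)
print(trend)  # ready to feed into a report or a Tableau extract
```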
Posted 1 month ago
9 - 14 years
9 - 19 Lacs
Hyderabad, Gurugram
Work from Office
Hi everyone. Open position: Business Analysis - CRM Subject Matter Expert role. Greetings from Tekaccel! This is an excellent opportunity with us. If you have that unique and unlimited passion for building world-class enterprise software products that turn into actionable intelligence, then we have the right opportunity for you and your career.

What are we looking for?
Skill: Business Analysis - CRM subject matter expert
Experience: 9+ years
Location: Gurugram, Hyderabad
Work mode: WFO
Looking for immediate joiners
Hire type: Contract to hire

*** We have a requirement for 1 BA who is a CRM subject matter expert. The candidate need not be from the telecom sector; a legal background will definitely help. ***

Shift Timings: Monday to Friday, UK shift (IST 01:30 PM to 10:00 PM / IST 02:30 PM to 11:00 PM during daylight saving).

Job Description:
--> Overall experience of around 9 years, with a minimum of 7 years in a Business Analyst role.
--> Proven experience in requirements gathering tools/techniques, including user stories, use cases, process modelling, and workshops/interviews.
--> Proven experience in business analysis tasks on large-scale CRM implementations in professional services organisations.
--> Data analysis skills (data flow diagrams, data modelling, data migration, and integration strategies).
--> Excellent communication skills in order to engage, lead, and influence stakeholders at all levels of the business.

If interested, please share your updated resume at naveen@tekaccel.com or WhatsApp at +91 7997763537.
Tekaccel Software Services India
Posted 1 month ago
4 - 8 years
15 - 30 Lacs
Bengaluru
Remote
Job Title: Senior GCP Data DevOps Engineer
Job Type: Remote
Exp: 4+ years

Position Overview:
As a Senior DevOps Engineer specializing in Google Cloud Platform (GCP), you will play a crucial role in designing, implementing, and managing our cloud infrastructure to ensure optimal performance, scalability, and reliability. You will collaborate closely with cross-functional teams to streamline development processes, automate deployment pipelines, and enhance overall system efficiency.

Responsibilities:
- Design, implement, and manage scalable and highly available cloud infrastructure on Google Cloud Platform (GCP) to support our applications and services.
- Develop and maintain CI/CD pipelines to automate the deployment, testing, and monitoring of applications and microservices.
- Collaborate with software engineering teams to optimize application performance, troubleshoot issues, and ensure smooth deployment processes.
- Implement and maintain infrastructure as code (IaC) using tools such as Terraform, Ansible, or Google Deployment Manager.
- Monitor system health, performance, and security metrics, and implement proactive measures to ensure reliability and availability.
- Implement best practices for security, compliance, and data protection in cloud environments.
- Continuously evaluate emerging technologies and industry trends to drive innovation and improve infrastructure efficiency.
- Mentor junior team members and provide technical guidance and support as needed.

Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 4-8 years of experience in a DevOps role, with a focus on Google Cloud Platform (GCP).
- In-depth knowledge of GCP services such as Compute Engine, Kubernetes Engine, Cloud Storage, Cloud SQL, Pub/Sub, and BigQuery.
- Proficiency in scripting languages such as Python, Bash, or PowerShell.
- Experience with containerization technologies such as Docker and container orchestration platforms like Kubernetes.
- Strong understanding of CI/CD concepts and experience with CI/CD tools such as Jenkins, GitLab CI/CD, or CircleCI.
- Solid understanding of infrastructure as code (IaC) principles and experience with tools such as Terraform, Ansible, or Google Deployment Manager.
- Experience with monitoring and logging tools such as Prometheus, Grafana, Stackdriver, or the ELK Stack.
- Knowledge of security best practices and experience implementing security controls in cloud environments.
- Excellent problem-solving skills and ability to troubleshoot complex issues in distributed systems.
- Strong communication skills and ability to collaborate effectively with cross-functional teams.

Preferred Qualifications:
- Google Cloud certification (e.g., Professional Cloud DevOps Engineer, Professional Cloud Architect).
- Experience with other cloud platforms such as AWS or Azure.
- Familiarity with agile methodologies and DevOps practices.
- Experience with software development using languages such as Java, Node.js, or Go.
- Knowledge of networking concepts and experience configuring network services in cloud environments.

Skills: GCP, Cloud SQL, BigQuery, Kubernetes, IaC tools, CI/CD pipelines, Terraform, Python, Airflow, Snowflake, Power BI, Dataflow, Pub/Sub, Cloud Storage, Cloud Computing
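As one concrete example of the Pub/Sub and Python scripting skills listed above, here is a minimal sketch that publishes a JSON deployment event to a Pub/Sub topic with the google-cloud-pubsub client. The project, topic and attribute names are hypothetical, and Application Default Credentials are assumed.

```python
# Minimal sketch: publishing a JSON event to a Pub/Sub topic from an automation
# script, the kind of glue code a GCP DevOps engineer writes. Project, topic and
# attribute names are hypothetical; requires google-cloud-pubsub and ADC credentials.
import json

from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-gcp-project", "deployment-events")

event = {"service": "orders-api", "version": "1.4.2", "status": "deployed"}

# publish() returns a future; result() blocks until the server acknowledges
# the message and returns the assigned message id.
future = publisher.publish(
    topic_path,
    data=json.dumps(event).encode("utf-8"),
    source="ci-pipeline",   # attributes are plain string key/value pairs
)
print("Published message id:", future.result())
```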
Posted 1 month ago
8 - 13 years
12 - 20 Lacs
Bengaluru
Hybrid
Project Role: Cloud Platform Engineer
Project Role Description: Designs, builds, tests, and deploys cloud application solutions that integrate cloud and non-cloud infrastructure. Can deploy infrastructure and platform environments, and creates a proof of architecture to test architecture viability, security and performance.
Must have skills: Google Cloud Platform Architecture
Good to have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: Any bachelor's degree

Summary: As a Cloud Platform Engineer, you will be responsible for designing, building, testing, and deploying cloud application solutions that integrate cloud and non-cloud infrastructure. You will deploy infrastructure and platform environments and create proofs of architecture to test architecture viability, security, and performance.

Roles & Responsibilities:
- Expected to be an SME
- Collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems for their immediate team and across multiple teams
- Lead the implementation of cloud solutions
- Optimize cloud infrastructure for performance and cost-efficiency
- Troubleshoot and resolve technical issues

Professional & Technical Skills:
- Must-have skills: proficiency in Google Cloud Platform architecture
- Strong understanding of cloud architecture principles
- Experience with DevOps practices
- Experience with Google Cloud SQL
- Hands-on experience in cloud deployment and management
- Knowledge of security best practices in cloud environments

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Google Cloud Platform architecture
- This position is based at our Bengaluru office
- A bachelor's degree is required
Posted 1 month ago
10 - 19 years
22 - 27 Lacs
Gurgaon
Work from Office
About The Role
Job Title: GN - SONG - Service - CX - Value Architect – Senior Manager
Management Level: 06 - Senior Manager
Location: Delhi, Gurgaon, Mumbai, Bangalore, Chennai, Pune, Hyderabad
Must have skills: Value Realization

Job Summary: As part of the team, you will provide transformation services driven by key offerings like Living Marketing, Connected Commerce and Advanced Customer Engagement. These services help our clients become living businesses by optimizing their marketing, sales and customer service strategy, thereby driving cost reduction, revenue enhancement and customer satisfaction, and impacting front-end business metrics in a positive manner.

Roles & Responsibilities:
- Translate strategic objectives into high-impact use cases in the specific area of expertise.
- Understand the client's business priorities and focus areas to identify the right business scenarios and impacted value levers (KPIs) to include in the business case.
- Ideate and execute compelling value creation workshops.
- Conduct detailed qualitative and quantitative research to lay the foundation of a strong business case.
- Own every stage of the value creation process, from research and identification to value drafting and dashboarding.
- Define value architecting requirements and work with Accenture teams to deliver solutions.
- Advise clients on industry best practices and client examples of value creation and realization.
- Accurately estimate time to complete work.
- Continually experiment with new tools and technologies and sharpen analytical skills.
- Research and provide strategic, goal-driven solutions for clients.
- Lead and collaborate with both offshore and onshore cross-functional and technical teams, including client-side managers, business heads, and other stakeholders across the organization.
- Provide useful contributions to team meetings and conversations, actively participating in client meetings and workshops; create hypotheses based on an understanding of clients' issues.

Professional & Technical Skills:
- Apply best-of-breed Excel practices; deep-dive with solid knowledge of formulas and macros to bring in speed and efficiency.
- Maximize experience in developing interactive models: use relevant dashboard creation platforms (Power BI, Tableau, etc.) to design and apply interactive dashboards.
- Innovate with creativity: demonstrate an ability to work in a fast-paced environment, abstracting value into a compelling business story.
- Participate in pre-sales activities including responses to RFPs, creating proofs of concept, creating effective presentations, demonstrating solutions during client orals, and the effort and cost estimation process.
- Participate in practice-specific initiatives including creating points of view, creating reusable assets in the contact center space, performing analysis of industry research and market trends, and bringing in innovative solutions.

Additional Information:
- 12+ years of experience in strategy/value office and consulting roles with P&L exposure
- Deep understanding of the Customer Service function for two industries
- Good understanding of sales and marketing as a function
- Solid experience in developing quantitative models, conducting qualitative and quantitative research, anchoring client/senior stakeholder conversations, and creating engaging storyboards using the best data visualization tools such as Power BI, Tableau, etc.
- Passionate about storytelling, with excellent visual skills (using PowerPoint and Figma).

About Our Company | Accenture

Qualification
Experience: 7 to 14 years
Educational Qualification: MBA from a Tier 1 institute
Posted 1 month ago