2.0 - 6.0 years
10 - 14 Lacs
Bengaluru
Work from Office
Description: Zeta Global is seeking a Solutions Associate for our Data Cloud Applications team to drive operational excellence, client support, and solution innovation. This role provides critical leverage to the team by supporting projects related to knowledge sharing, operational execution, and strategic solution enhancement. The Solutions Associate will work closely with Zeta's key partners to help win new business, grow existing accounts, and maintain their competitive edge. They will have the autonomy to develop unique working models that best fit their strengths and workflow preferences while maintaining strong collaboration with the broader Zeta team and client stakeholders. The Solutions Associate will play a key role in informing Zeta's product roadmap by capturing client feedback and identifying opportunities for greater efficiency and effectiveness. Success in this role will be measured by the ability to deliver on critical client requests and contribute meaningfully to client satisfaction and long-term growth.
Roles & Responsibilities:
- Develop a comprehensive understanding of the Zeta Data Cloud Identity Graph, attributes, and signals to support audience curation and data-related inquiries
- Demonstrate a deep understanding of Zeta's Opportunity Explorer solutions, with the ability to demo these solutions internally and externally
- Identify strategic opportunities from Data Cloud Intelligence solutions and present actionable findings to client stakeholders during insight readouts
- Act as a primary point of contact for Data Cloud-related questions from client account teams, providing accurate and timely support
- Offer strategic recommendations during RFP responses, identifying creative applications of Zeta's identity, intelligence, and activation solutions to differentiate client proposals
- Train client account teams on how to leverage Data Cloud Intelligence solutions, enhancing client teams' ability to independently utilize platform features
- Support day-to-day Data Cloud operational requests, ensuring smooth execution of client initiatives
- Independently kick off and troubleshoot Data Cloud reports, ensuring timely and successful delivery to stakeholders
- Audit and maintain client accounts, verifying that all requested solutions are accurately loaded and active
- Capture client needs and feedback that align with the Zeta product roadmap, acting as a liaison between client teams and Zeta's Product team
- Advocate for client-driven enhancements, ensuring client needs are communicated clearly to influence future platform developments
Qualifications:
- Thrives in a challenging, fast-paced entrepreneurial environment with real-time impact on day-to-day business, championing a high-agency mindset
- Highly organized and detail-oriented, with proven ability to manage multiple projects and prioritize effectively under dynamic conditions
- Analytical thinker, comfortable with quantitative analysis and data interpretation
- Translates complex data findings into clear, concise, and compelling narratives tailored to various audiences
- Creative problem-solver who can think outside the box to develop innovative solutions
- Collaborative team player with strong independent working skills; self-motivated and dependable in driving initiatives forward
- Proficient in Excel (VLOOKUPs, pivot tables, logic-based queries, data cleaning and filtering)
- Advanced in Microsoft PowerPoint for professional client-facing presentations
Preferred Qualifications:
- Expert in Microsoft PowerPoint
- Proficient in Tableau
- Working understanding of SQL and relational databases
Posted 3 weeks ago
5.0 - 10.0 years
7 - 11 Lacs
Bengaluru
Work from Office
Senior Software Engineer - Data
We're seeking a Senior Software Engineer or a Lead Software Engineer to join one of our Data Layer teams. As the name implies, the Data Layer is at the core of all things data at Zeta. Our responsibilities include: developing and maintaining the Zeta Identity Graph platform, which collects billions of behavioural, demographic, location, and transactional signals to power people-based marketing; ingesting vast amounts of identity and event data from our customers and partners; facilitating data transfers across systems; ensuring the integrity and health of our datasets; and much more. As a member of this team, the data engineer will be responsible for designing and expanding our existing data infrastructure, enabling easy access to data, supporting complex data analyses, and automating optimization workflows for business and marketing operations.
Essential Responsibilities: As a Senior Software Engineer or a Lead Software Engineer, your responsibilities will include:
- Building, refining, tuning, and maintaining our real-time and batch data infrastructure
- Using technologies such as Spark, Airflow, Snowflake, Hive, Scylla, Django, FastAPI, etc. daily
- Maintaining data quality and accuracy across production data systems
- Working with Data Engineers to optimize data models and workflows
- Working with Data Analysts to develop ETL processes for analysis and reporting
- Working with Product Managers to design and build data products
- Working with our DevOps team to scale and optimize our data infrastructure
- Participating in architecture discussions, influencing the road map, and taking ownership of and responsibility for new projects
- Participating in an on-call rotation in their respective time zones (being available by phone or email in case something goes wrong)
Desired Characteristics:
- Minimum 5-10 years of software engineering experience
- Proven long-term experience and enthusiasm for distributed data processing at scale; eagerness to learn new things
- Expertise in designing and architecting distributed, low-latency, and scalable solutions in either cloud or on-premises environments
- Exposure to the whole software development lifecycle, from inception to production and monitoring
- Fluency in Python, or solid experience in Scala or Java
- Proficiency with relational databases and advanced SQL
- Expertise in the use of services like Spark and Hive
- Experience with web frameworks such as Flask and Django
- Experience with a scheduler such as Apache Airflow, Apache Luigi, Chronos, etc. (a minimal Airflow sketch follows below)
- Experience with Kafka or other stream message processing solutions
- Experience using cloud services (AWS) at scale
- Experience with agile software development processes
- Excellent interpersonal and communication skills
Nice to have:
- Experience with large-scale / multi-tenant distributed systems
- Experience with columnar / NoSQL databases (Vertica, Snowflake, HBase, Scylla, Couchbase)
- Experience with real-time streaming frameworks (Flink, Storm)
- Experience with open table formats such as Iceberg, Hudi, or Delta Lake
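The scheduler requirement above (Apache Airflow, Luigi, Chronos) reflects the kind of tooling used daily in this role. As a rough, generic illustration only (the DAG name, task names, and the extract/load steps are invented for this sketch and are not Zeta's actual pipeline), a minimal daily batch job in Airflow 2.x might be declared like this:

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_events(**context):
    # Placeholder extract step: in a real pipeline this would pull the previous
    # day's identity/event data from a source system (Kafka topic, S3 bucket, etc.).
    print(f"extracting events for {context['ds']}")


def load_to_warehouse(**context):
    # Placeholder load step: write the transformed batch into the warehouse.
    print(f"loading batch for {context['ds']}")


with DAG(
    dag_id="daily_event_batch",  # hypothetical name, for illustration only
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=10)},
) as dag:
    extract = PythonOperator(task_id="extract_events", python_callable=extract_events)
    load = PythonOperator(task_id="load_to_warehouse", python_callable=load_to_warehouse)

    extract >> load  # load runs only after the extract task succeeds
```

A production pipeline would add per-task alerting, SLAs, and real extract/load logic, but the explicit dependency declaration (`extract >> load`) is the core scheduling idea.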
Posted 3 weeks ago
5.0 - 8.0 years
7 - 10 Lacs
Hyderabad
Work from Office
Technical Lead - Java Fullstack (AWS)
Mandatory skills: Fullstack Java + React + AWS
- 6-10 years of Java development experience with JSE/JEE, Java-based microservices frameworks and implementation, Spring framework, Hibernate framework, SQL, etc.
- Hands-on experience with Spring Boot and SPARK microservices and OSGi specifications
- Hands-on experience with ReactJS
- Strong knowledge of microservice logging, monitoring, debugging, and testing
- Implementation experience with microservice integration, packaging, build automation, and deployment
- At least two years of experience in SOA and microservices-based process applications using BPM (Activiti/jBPM/Camunda)
- Object-oriented analysis and design using common design patterns
- Insight into Java and JEE internals (class loading, memory management, transaction management, etc.)
- Excellent knowledge of relational databases, SQL, and ORM technologies (JPA2, Hibernate)
- Experience developing web applications using at least one popular web framework (JSF, Wicket, GWT, Spring MVC, Spring Boot)
- Hands-on experience with relational and NoSQL databases (MongoDB or Cassandra; either one is a must)
Mandatory Skills: Fullstack Java Enterprise. Experience: 5-8 Years.
Posted 3 weeks ago
1.0 - 3.0 years
2 - 6 Lacs
Chennai
Work from Office
Develop and execute test plans and cases to ensure software quality, identifying and reporting defects. Collaborate with developers to resolve issues, participate in code reviews, and maintain test documentation. Contribute to improving the QA process by applying testing best practices and utilizing bug tracking systems within the SDLC.
Key Responsibilities:
- Develop and execute test cases and test plans
- Identify and report software defects
- Perform functional, regression, and performance testing
- Collaborate with developers to resolve issues
- Participate in code reviews and provide feedback on testability
- Document test results and maintain test documentation
- Learn and apply software testing best practices
- Work with bug tracking systems
- Understand the software development lifecycle (SDLC)
- Assist in creating and maintaining automated test scripts
- Familiarity with testing tools and frameworks
- Ability to analyze and interpret test results
- Basic understanding of different testing methodologies
- Contribute to improving the QA process
- Follow project testing standards
Qualifications:
- Extensive experience in ETL, data warehousing, and BI reporting testing
- Proficiency in SQL, Python for automation, and Azure Databricks
- Strong understanding of relational databases and XML
- Experience with test automation, Agile/Waterfall methodologies, and Atlassian tools
- Excellent communication and problem-solving skills
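Since the qualifications above mention Python for test automation, here is a small, self-contained sketch of an automated functional test in pytest. The function under test and the expected values are invented purely for illustration and are not part of any actual product under test.

```python
import pytest


def normalise_amount(raw: str) -> float:
    """Function under test (hypothetical): parse an amount string like ' 1,250.50 ' into a float."""
    return float(raw.strip().replace(",", ""))


@pytest.mark.parametrize(
    "raw, expected",
    [
        ("100", 100.0),
        (" 1,250.50 ", 1250.50),
        ("0", 0.0),
    ],
)
def test_normalise_amount_valid_inputs(raw, expected):
    # Functional check: valid inputs parse to the expected numeric value.
    assert normalise_amount(raw) == expected


def test_normalise_amount_rejects_garbage():
    # Negative check: clearly invalid input should raise, not silently pass.
    with pytest.raises(ValueError):
        normalise_amount("not-a-number")
```

Parametrized cases cover the positive functional checks, while the negative test documents how invalid input is expected to fail; the same pattern scales to regression suites run in CI.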
Posted 3 weeks ago
2.0 - 4.0 years
6 - 7 Lacs
Mumbai
Work from Office
CRISIL is looking for a Database Developer to join our dynamic team and embark on a rewarding career journey. The developer should be proficient in database design, programming languages, and SQL. The key responsibilities of a Database Developer may include:
1. Developing database solutions to store and manage large amounts of data.
2. Creating database schemas that represent and support business processes.
3. Optimizing database performance by identifying and resolving issues with indexing, query design, and other performance-related factors.
4. Developing and maintaining database applications and interfaces that allow users to access and manipulate data.
A successful Database Developer should have strong technical skills, including proficiency in database design, programming languages, and SQL. They should have experience working with large and complex data sets and knowledge of relational databases and SQL. The developer should also have experience with database management systems such as Oracle, MySQL, or SQL Server.
Posted 3 weeks ago
3.0 - 8.0 years
2 - 5 Lacs
Mumbai, Chennai
Hybrid
Required Skills:
- Working knowledge of Big Data / cloud-based / columnar databases (such as Snowflake, AWS Redshift, BigQuery, etc.)
- Experience in scripting languages like Python, R, Spark, Perl, etc.
- Very good knowledge of ETL concepts
- Experience in ETL tools like SSIS, Talend, Pentaho, etc., is a plus
- Very good knowledge of SQL querying
- Rich experience in relational data modelling
- Experience in developing logical, physical, and conceptual data models
- Experience in AWS and Google Cloud services is a plus
- Experience with BI dashboards is a plus
- Implementing automated testing platforms and unit tests
- Proficient understanding of code versioning tools, such as Git
- Strong analytical skills and problem-solving aptitude
- Ability to learn new technologies quickly
- Works with other team members in a collaborative manner
- Passionate to learn and work on versatile technologies
Notice Period: Immediate 15 days
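The skills above centre on ETL concepts and SQL querying. A minimal, hedged sketch of the extract-transform-load idea, assuming a hypothetical `orders` table and using SQLite files to stand in for a real source database and warehouse, could look like this:

```python
import sqlite3

import pandas as pd

# Extract: read raw rows from a (hypothetical) relational source table.
src = sqlite3.connect("source.db")
orders = pd.read_sql_query(
    "SELECT order_id, customer_id, amount, order_date FROM orders", src
)

# Transform: clean the data and build a daily revenue rollup.
orders["order_date"] = pd.to_datetime(orders["order_date"])
orders = orders.dropna(subset=["customer_id", "amount"])
daily_revenue = (
    orders.groupby(orders["order_date"].dt.date)["amount"].sum().reset_index()
)
daily_revenue.columns = ["order_date", "revenue"]

# Load: write the aggregate into a reporting table (another SQLite file stands in
# for the warehouse here; in practice this could be Snowflake, Redshift, BigQuery, etc.).
tgt = sqlite3.connect("warehouse.db")
daily_revenue.to_sql("daily_revenue", tgt, if_exists="replace", index=False)
```

Dedicated ETL tools such as SSIS, Talend, or Pentaho wrap the same extract/transform/load stages in a visual or declarative layer; the sketch only shows the underlying flow.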
Posted 3 weeks ago
8.0 - 10.0 years
22 - 27 Lacs
Bengaluru
Work from Office
Role Purpose: The purpose of the role is to define and develop the Enterprise Data Structure, along with Data Warehouse, Master Data, Integration, and transaction processing, while maintaining and strengthening the modelling standards and business information.
Do:
1. Define and develop a Data Architecture that aids the organization and clients in new/existing deals
a. Partner with business leadership (adopting the rationalization of the data value chain) to provide strategic, information-based recommendations to maximize the value of data and information assets, and to protect the organization from disruptions while also embracing innovation
b. Assess the benefits and risks of data by using tools such as business capability models to create a data-centric view that quickly visualizes what data matters most to the organization, based on the defined business strategy
c. Create data strategy and road maps for the Reference Data Architecture as required by the clients
d. Engage all stakeholders to implement data governance models and ensure that the implementation is done based on every change request
e. Ensure that the data storage and database technologies are supported by the data management and infrastructure of the enterprise
f. Develop, communicate, support, and monitor compliance with Data Modelling standards
g. Oversee and monitor all frameworks to manage data across the organization
h. Provide insights on database storage and platforms for ease of use and the least manual work
i. Collaborate with vendors to ensure integrity, objectives, and system configuration
j. Collaborate with functional and technical teams and clients to understand the implications of data architecture and maximize the value of information across the organization
k. Present the data repository, objects, and source systems along with data scenarios for front-end and back-end usage
l. Define high-level data migration plans to transition the data from source to target system/application, addressing the gaps between the current and future state, typically in sync with IT budgeting or other capital planning processes
m. Maintain knowledge of all the data service provider platforms and ensure an end-to-end view
n. Oversee all the data standards/reference papers for proper governance
o. Promote, guard, and guide the organization towards common semantics and the proper use of metadata
p. Collect, aggregate, match, consolidate, quality-assure, persist, and distribute such data throughout the organization to ensure a common understanding, consistency, accuracy, and control
q. Provide solutions for RFPs received from clients and ensure overall implementation assurance
i. Develop a direction to manage the portfolio of all databases, including systems and shared infrastructure services, in order to better match business outcome objectives
ii. Analyse the technology environment, enterprise specifics, and client requirements to set a collaboration solution for big/small data
iii. Provide technical leadership for the implementation of custom solutions through thoughtful use of modern technology
iv. Define and understand current issues and problems and identify improvements
v. Evaluate and recommend solutions to integrate with the overall technology ecosystem, keeping consistency throughout
vi. Understand the root-cause problems in integrating business and product units
vii. Validate the solution/prototype from a technology, cost-structure, and customer-differentiation point of view
viii. Collaborate with sales and delivery leadership teams to identify future needs and requirements
ix. Track industry and application trends and relate these to planning current and future IT needs
2. Build an enterprise technology environment for data architecture management
a. Develop, maintain, and implement standard patterns for data layers, data stores, data hub & lake, and data management processes
b. Evaluate all the implemented systems to determine their viability in terms of cost effectiveness
c. Collect all the structural and non-structural data from different places and integrate it into one database form
d. Work through every stage of data processing: analysing, creating physical data model designs, solutions, and reports
e. Build the enterprise conceptual and logical data models for analytics, operational, and data mart structures in accordance with industry best practices
f. Implement the best security practices across all the databases based on the accessibility and technology
g. Demonstrate a strong understanding of activities within the primary discipline, such as Master Data Management (MDM), Metadata Management, and Data Governance (DG)
h. Demonstrate strong experience in conceptual, logical, and physical database architectures, design patterns, best practices, and programming techniques around relational data modelling and data integration
3. Enable delivery teams by providing optimal delivery solutions/frameworks
a. Build and maintain relationships with delivery and practice leadership teams and other key stakeholders to become a trusted advisor
b. Define database physical structure, functional capabilities, security, back-up, and recovery specifications
c. Develop and establish relevant technical, business process, and overall support metrics (KPI/SLA) to drive results
d. Monitor system capabilities and performance by performing tests and configurations
e. Integrate new solutions and troubleshoot previously occurred errors
f. Manage multiple projects and accurately report the status of all major assignments while adhering to all project management standards
g. Identify technical, process, and structural risks and prepare a risk mitigation plan for all the projects
h. Ensure quality assurance of all the architecture or design decisions and provide technical mitigation support to the delivery teams
i. Recommend tools for reuse and automation for improved productivity and reduced cycle times
j. Help the support and integration team achieve better efficiency, client experience, and ease of use by using AI methods
k. Develop trust and build effective working relationships through respectful, collaborative engagement across individual product teams
l. Ensure architecture principles and standards are consistently applied to all the projects
m. Ensure optimal client engagement
i. Support the pre-sales team while presenting the entire solution design and its principles to the client
ii. Negotiate, manage, and coordinate with the client teams to ensure all requirements are met
iii. Demonstrate thought leadership with strong technical capability in front of the client to win their confidence and act as a trusted advisor
Mandatory Skills: Data Governance. Experience: 8-10 Years.
Posted 3 weeks ago
7.0 - 12.0 years
12 - 16 Lacs
Bengaluru
Work from Office
We are looking for Lead or Principal Software Engineers to join our Data Cloud team. Our Data Cloud team is responsible for the Zeta Identity Graph platform, which captures billions of behavioural, demographic, environmental, and transactional signals for people-based marketing. As part of this team, the data engineer will be designing and growing our existing data infrastructure to democratize data access, enable complex data analyses, and automate optimization workflows for business and marketing operations.
Job Description:
Essential Responsibilities: As a Lead or Principal Data Engineer, your responsibilities will include:
- Building, refining, tuning, and maintaining our real-time and batch data infrastructure
- Using technologies such as HDFS, Spark, Snowflake, Hive, HBase, Scylla, Django, FastAPI, etc. daily
- Maintaining data quality and accuracy across production data systems
- Working with Data Engineers to optimize data models and workflows
- Working with Data Analysts to develop ETL processes for analysis and reporting
- Working with Product Managers to design and build data products
- Working with our DevOps team to scale and optimize our data infrastructure
- Participating in architecture discussions, influencing the road map, and taking ownership of and responsibility for new projects
- Participating in a 24/7 on-call rotation (being available by phone or email in case something goes wrong)
Desired Characteristics:
- Minimum 7 years of software engineering experience
- Proven long-term experience and enthusiasm for distributed data processing at scale; eagerness to learn new things
- Expertise in designing and architecting distributed, low-latency, and scalable solutions in either cloud or on-premises environments
- Exposure to the whole software development lifecycle, from inception to production and monitoring
- Fluency in Python, or solid experience in Scala or Java
- Proficiency with relational databases and advanced SQL
- Expertise in the use of services like Spark, HDFS, Hive, and HBase
- Experience with a scheduler such as Apache Airflow, Apache Luigi, Chronos, etc.
- Experience using cloud services (AWS) at scale
- Experience with agile software development processes
- Excellent interpersonal and communication skills
Nice to have:
- Experience with large-scale / multi-tenant distributed systems
- Experience with columnar / NoSQL databases (Vertica, Snowflake, HBase, Scylla, Couchbase)
- Experience with real-time streaming frameworks (Flink, Storm)
- Experience with web frameworks such as Flask and Django
Posted 3 weeks ago
2.0 - 7.0 years
7 - 17 Lacs
Bengaluru
Work from Office
In this role, you will:
- Consult with business line and enterprise functions on less complex research
- Use functional knowledge to assist in non-model quantitative tools that support strategic decision making
- Perform analysis of findings and trends using statistical analysis and document the process
- Present recommendations to increase revenue, reduce expense, and maximize operational efficiency, quality, and compliance
- Identify and define business requirements and translate data and business needs into research and recommendations to improve efficiency
- Participate in all group technology efforts, including design and implementation of database structures, analytics software, storage, and processing
- Develop customized reports and ad hoc analyses to make recommendations and provide guidance to less experienced staff
- Understand compliance and risk management requirements for the supported area
- Ensure adherence to data management and data governance regulations and policies
- Participate in company initiatives or processes to assist in meeting risk and capital objectives and other strategic goals
- Collaborate and consult with more experienced consultants and with partners in technology and other business groups
Required Qualifications:
- 2+ years of analytics experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education
Desired Qualifications:
- Bachelor's degree or higher in a quantitative field such as computer science, engineering, applied math, accounting, finance, economics, or econometrics with a quantitative emphasis
- Experience in one or a combination of the following: data quality, reporting, analytics
- Prior experience in an internal/external consultative role and/or investment banking/corporate banking support experience is preferred
- Prior experience in handling small or medium complexity projects individually, participation in complex or large-scale projects, and experience with matrix-managed teams is a plus
- Solid mastery of advanced SQL, advanced Excel, Power BI, and Microsoft PowerPoint, plus knowledge of performing ETL tasks using SQL and Python
- A strong interest in developing both business and technical skills while navigating a dynamic business environment
- Excellent verbal, written, and interpersonal communication skills
- Strong analytical skills with high attention to detail and accuracy
- Experience with and knowledge of relational databases and concepts
- Ability to interact and build relationships with senior leaders, peers, and key support partners
- Ability to work creatively, analytically, and independently in a dynamic environment
- Exceptional oral and written communication skills and the ability to communicate effectively with both business unit (non-technical) and development (technical) personnel
- Experience documenting processes and reporting workflows
Job Expectations:
- Strong individual contributor with excellent communication skills for a variety of audiences (other technical staff and senior management), both verbally and in writing
- Experience in problem analysis, solution implementation, and change management
- Must make timely and independent judgment decisions while working in a fast-paced and results-driven environment
- Ability to articulate issues, risks, and proposed solutions to various levels of staff and management
- Capability to multi-task and finish work within strict timelines and provide timely requests for information and follow-up questions
- Skill in managing relationships with key stakeholders
- Eagerness to contribute collaboratively on projects and discussions
- Perpetual interest in learning something new, while being comfortable with not knowing all the answers
- Attention to detail in both analytics and documentation
- Connect with customers to understand and document business process workflows
- Develop an understanding of business processes and recommend efficiencies based on technical and architectural knowledge
- Provide technical consulting to business leaders at an appropriate level of information encapsulation
- Apply critical thinking skills and perform advanced analytics with the goal of solving complex and multi-faceted business problems
- Develop reports using Power BI, SQL, Tableau, Python, Excel, or various other tools
- SSRS knowledge is an added advantage
- Verify the accuracy of reports, reconciling data and producing data/information within established timelines
- Generate deep insights through the analysis of data and an understanding of operational processes, and turn them into actionable recommendations
- Work with technology partners to manage tables, views, and other database structures to adapt to changing business needs, in compliance with embedded IT policies
- Answer ad hoc questions from customers, including confirming data, populating templates provided to the group, or creating new reports/extracts as requested by customers
- Collaborate with team members to operate at a higher level of dimensionality, innovating and bringing game-changing ideas to fruition
- Partner with business stakeholders in the development of key business reporting (e.g. portfolio-level dashboards, productivity, regulatory reporting, etc.) and optimal delivery channels for the distribution of the reporting solutions
Posted 3 weeks ago
5.0 - 10.0 years
20 - 30 Lacs
Delhi / NCR, Bengaluru
Work from Office
Description:
Job Title: Senior Python Developer (AWS, SQL, Django/Flask)
Experience: 6–8 Years
Location: Noida / Bangalore (Hybrid)
Notice Period: Immediate Joiners Preferred
Requirements:
Must-Have Skills:
- Strong programming experience in Python (6+ years)
- Hands-on expertise with the Django and/or Flask frameworks
- Proven experience with AWS services - Lambda, EC2, S3, RDS, etc.
- Strong understanding of SQL and relational database systems
- Experience with RESTful API development and integration
- Good understanding of software engineering best practices (CI/CD, version control, testing)
Job Responsibilities:
Key Responsibilities:
- Develop, maintain, and optimize scalable backend services using Python with the Django and/or Flask frameworks (a minimal Flask sketch follows below)
- Design and implement APIs and integration solutions with a strong focus on performance and reliability
- Work on cloud-based architecture and deployment using AWS (EC2, Lambda, S3, RDS, etc.)
- Develop robust data models and queries using SQL, and work with relational databases like PostgreSQL or MySQL
- Participate in code reviews, unit testing, and application monitoring to ensure quality deliverables
- Collaborate with cross-functional teams to understand requirements and deliver efficient solutions
What We Offer:
- Exciting Projects: We focus on industries like high-tech, communication, media, healthcare, retail, and telecom. Our customer list is full of fantastic global brands and leaders who love what we build for them.
- Collaborative Environment: You can expand your skills by collaborating with a diverse team of highly talented people in an open, laid-back environment, or even abroad in one of our global centers or client facilities!
- Work-Life Balance: GlobalLogic prioritizes work-life balance, which is why we offer flexible work schedules, opportunities to work from home, and paid time off and holidays.
- Professional Development: Our dedicated Learning & Development team regularly organizes communication skills training (GL Vantage, Toastmasters), a stress management program, professional certifications, and technical and soft skill trainings.
- Excellent Benefits: We provide our employees with competitive salaries, family medical insurance, Group Term Life Insurance, Group Personal Accident Insurance, NPS (National Pension Scheme), periodic health awareness programs, extended maternity leave, annual performance bonuses, and referral bonuses.
- Fun Perks: We want you to love where you work, which is why we host sports events and cultural activities, offer food at subsidised rates, and throw corporate parties. Our vibrant offices also include dedicated GL Zones, rooftop decks, and a GL Club where you can drink coffee or tea with your colleagues over a game, and we offer discounts for popular stores and restaurants!
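The responsibilities above revolve around building REST APIs in Django or Flask and deploying them on AWS. As a minimal, hedged sketch (the resource, routes, and in-memory store are invented for illustration and are not the actual service this role would work on), a small Flask API might look like this:

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

# Hypothetical in-memory store standing in for a PostgreSQL/MySQL table on RDS.
CUSTOMERS = {1: {"id": 1, "name": "Acme Ltd"}}


@app.get("/customers/<int:customer_id>")
def get_customer(customer_id):
    # Read endpoint: return the record or a 404 if it does not exist.
    customer = CUSTOMERS.get(customer_id)
    if customer is None:
        return jsonify({"error": "not found"}), 404
    return jsonify(customer)


@app.post("/customers")
def create_customer():
    # Create endpoint: accept a JSON payload and return the stored record.
    payload = request.get_json(force=True)
    new_id = max(CUSTOMERS, default=0) + 1
    CUSTOMERS[new_id] = {"id": new_id, "name": payload.get("name", "")}
    return jsonify(CUSTOMERS[new_id]), 201


if __name__ == "__main__":
    app.run(debug=True)
```

In the role described above, handler logic like this would sit behind real persistence, input validation, authentication, and a CI/CD pipeline rather than an in-memory dictionary.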
Posted 3 weeks ago
1.0 - 3.0 years
10 - 14 Lacs
Gurugram
Work from Office
About the Role:
Grade Level (for internal use): 07
The Team: S&P Global Mobility is seeking someone who is self-motivated, passionate about data and automobiles, and willing to work with a geographically dispersed Operations team. The North American VIN team, part of the Global Data Operations (GDO) team, is responsible for research, analysis, and maintaining over 200 vehicle attributes in the North American market. Our team serves as the first source of truth for data, integrating into multiple S&P Global products. We value collaboration and maintain excellent relationships with OEMs worldwide, striving to provide actionable intelligence to our customers.
Responsibilities and Impact: As a member of our team you will get an opportunity to work on various data tools and applications. The team member will be responsible for:
- Research, Process, and Maintain Vehicle Data: Conduct thorough research and capture, validate, and manage data related to over 200 vehicle attributes in the North American market.
- Data Transformation: Run SQL queries to extract data and transform the results into Excel files for analysis and coding into our tools and applications (a minimal sketch of this step appears at the end of this listing).
- Client Queries and Case Investigation: Investigate client queries and Salesforce cases, identify discrepancies in existing coded data, and provide findings to vertical leads.
- Provide Actionable Intelligence: Strive to deliver actionable insights to customers, enhancing the value of S&P Global products.
- Identify Process Improvements: Identify process improvements within products and work to automate existing processes.
- Gen AI Innovation: Work on Gen AI ideas and the S&P Global internal Gen AI tool (Spark) to make existing processes more efficient.
What We're Looking For
Required Qualifications:
- Educational Background: B.Tech (Mechanical, with specialization in Automobile Engineering preferred) or other similar Bachelor's/Master's degrees.
- Experience: 1-3 years of experience in data management.
- Technical Skills: Strong knowledge of MS Excel, Word, and PowerPoint.
- Scripting and Database Knowledge: Proficiency in scripting languages such as Python and SQL/PLSQL, along with a solid understanding of relational databases.
- Attention to Detail: High attention to detail and accuracy in data management tasks.
- Course Knowledge: Familiarity with data management and operations, analytics, and business intelligence.
- Problem-Solving Skills: Strong analytical thought process and a drive for learning.
- Communication Skills: Strong written and verbal communication skills.
- Team Collaboration: Ability to work effectively with a geographically dispersed Operations team.
Preferred Qualifications:
- SQL, Python, and Gen AI expertise
- Automobile sector background
About S&P Global Mobility: At S&P Global Mobility, we provide invaluable insights derived from unmatched automotive data, enabling our customers to anticipate change and make decisions with conviction. Our expertise helps them to optimize their businesses, reach the right consumers, and shape the future of mobility. We open the door to automotive innovation, revealing the buying patterns of today and helping customers plan for the emerging technologies of tomorrow. For more information, visit www.spglobal.com/mobility.
What's In It For You
Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology: the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence, pinpointing risks and opening possibilities. We Accelerate Progress.
Our People: Our Values: Integrity, Discovery, Partnership. At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.
Benefits: We take care of you, so you can take care of business. We care about our people. That's why we provide everything you and your career need to thrive at S&P Global.
- Health & Wellness: Health care coverage designed for the mind and body.
- Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.
- Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
- Family Friendly Perks: It's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
- Beyond the Basics: From retail discounts to referral incentive awards, small perks can make a big difference.
For more information on benefits by country visit https://spgbenefits.com/benefit-summaries
Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.
Recruitment Fraud Alert: If you receive an email from a spglobalind.com domain or any other regionally based domains, it is a scam and should be reported to reportfraud@spglobal.com. S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, pre-employment training or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here.
----
Equal Opportunity Employer: S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person.
US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law.
Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf
----
20 - Professional (EEO-2 Job Categories-United States of America), DTMGOP203 - Entry Professional (EEO Job Group), SWP Priority Ratings - (Strategic Workforce Planning)
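The Data Transformation responsibility in this listing describes running SQL queries and turning the results into Excel files for analysis. A minimal sketch of that step, assuming a hypothetical `vehicle_attributes` table (the schema, query, and file names are invented for illustration and are not S&P Global's actual data), might be:

```python
import sqlite3

import pandas as pd

# Hypothetical query against a vehicle-attribute table; names are invented for this sketch.
QUERY = """
SELECT vin_pattern, model_year, make, model, body_style
FROM vehicle_attributes
WHERE model_year = 2024
"""

with sqlite3.connect("vehicles.db") as conn:
    df = pd.read_sql_query(QUERY, conn)

# Light validation before handing the file to analysts.
assert df["vin_pattern"].is_unique, "duplicate VIN patterns found"

# Export to Excel for review and coding into downstream tools.
df.to_excel("vehicle_attributes_2024.xlsx", index=False, sheet_name="MY2024")
```

The same pattern (query, validate, export) applies regardless of whether the source is SQLite, Oracle, or a cloud warehouse; only the connection changes.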
Posted 3 weeks ago
1.0 - 2.0 years
3 - 4 Lacs
Gurugram, Bengaluru
Work from Office
About the Role:
Grade Level (for internal use): 08
S&P Global Mobility
The Role: Data Engineer
The Team: We are the Research and Modeling team, driving innovation by building robust models and tools to support the Vehicle & Powertrain Forecast team. Our work includes all aspects of development of, and ongoing support for, our business line data flows, analyst modelling solutions and forecasts, new apps, new client-facing products, and many other work areas besides. We value ownership, adaptability, and a passion for learning, while fostering an environment where diverse perspectives and mentorship fuel continuous growth.
The Impact: We are seeking a motivated and talented Data Engineer to be a key player in building the robust data infrastructure and flows that support our advanced forecasting models. Your initial focus will be to create a robust data factory to ensure smooth collection and refresh of actual data, a critical component that feeds our forecast. Additionally, you will be assisting in developing mathematical models and supporting the work of ML engineers and data scientists. Your work will significantly impact our ability to deliver timely and insightful forecasts to our clients.
What's in it for you:
- Opportunity to build foundational data infrastructure that directly impacts advanced forecasting models and client delivery.
- Gain exposure to and support the development of sophisticated mathematical models, Machine Learning, and Data Science applications.
- Contribute significantly to delivering timely and insightful forecasts, influencing client decisions in the automotive sector.
- Work in a collaborative environment that fosters continuous learning, mentorship, and professional growth in data engineering and related analytical fields.
Responsibilities:
- Data Pipeline Development: Design, build, and maintain scalable and reliable data pipelines for efficient data ingestion, processing, and storage, primarily focusing on creating a data factory for our core forecasting data.
- Data Quality and Integrity: Implement robust data quality checks and validation processes to ensure the accuracy and consistency of data used in our forecasting models (a minimal sketch of such checks follows below).
- Mathematical Model Support: Collaborate with other data engineers to develop and refine the mathematical logic and models that underpin our forecasting methodologies.
- ML and Data Science Support: Provide data support to our Machine Learning Engineers and Data Scientists.
- Collaboration and Communication: Work closely with analysts, developers, and other stakeholders to understand data requirements and deliver effective solutions.
- Innovation and Improvement: Continuously explore and evaluate new technologies and methodologies to enhance our data infrastructure and forecasting capabilities.
What We're Looking For:
- Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
- Minimum of 1-2 years of experience in data engineering, with a proven track record of building and maintaining data pipelines.
- Strong proficiency in SQL and experience with relational and non-relational databases.
- Strong Python programming skills, with experience in data manipulation and processing libraries (e.g., Pandas, NumPy).
- Experience with mathematical modelling and supporting ML and data science teams.
- Experience with cloud platforms (e.g., AWS, Azure, GCP) and cloud-based data services.
- Strong communication and collaboration skills, with the ability to work effectively in a team environment.
- Experience in the automotive sector is a plus.
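The Data Quality and Integrity responsibility above is typically automated inside the pipeline itself. A minimal pandas sketch, assuming a hypothetical registrations feed with invented column names (not S&P Global's actual schema), might run checks like these before data reaches the forecasting models:

```python
import pandas as pd

# Hypothetical monthly registrations feed; column names are invented for this sketch.
df = pd.read_csv("registrations.csv", parse_dates=["month"])

checks = {
    "no_missing_keys": df["vehicle_id"].notna().all(),
    "no_duplicate_rows": not df.duplicated(subset=["vehicle_id", "month"]).any(),
    "volumes_non_negative": (df["units"] >= 0).all(),
    "months_in_expected_range": df["month"].between("2020-01-01", pd.Timestamp.today()).all(),
}

failed = [name for name, passed in checks.items() if not passed]
if failed:
    # In a real pipeline this would alert or block the load rather than just raise.
    raise ValueError(f"data quality checks failed: {failed}")
print("all data quality checks passed")
```

Each check is a simple boolean assertion over the frame, which keeps the validation step easy to extend as new attributes are added to the feed.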
Statement: S&P Global delivers essential intelligence that powers decision making. We provide the world's leading organizations with the right data, connected technologies and expertise they need to move ahead. As part of our team, you'll help solve complex challenges that equip businesses, governments and individuals with the knowledge to adapt to a changing economic landscape. S&P Global Mobility uses invaluable insights captured from automotive data to help our clients understand today's market, reach more customers, and shape the future of automotive mobility.
About S&P Global Mobility: At S&P Global Mobility, we provide invaluable insights derived from unmatched automotive data, enabling our customers to anticipate change and make decisions with conviction. Our expertise helps them to optimize their businesses, reach the right consumers, and shape the future of mobility. We open the door to automotive innovation, revealing the buying patterns of today and helping customers plan for the emerging technologies of tomorrow. For more information, visit www.spglobal.com/mobility.
What's In It For You
Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology: the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence, pinpointing risks and opening possibilities. We Accelerate Progress.
Our People: Our Values: Integrity, Discovery, Partnership. At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.
Benefits: We take care of you, so you can take care of business. We care about our people. That's why we provide everything you and your career need to thrive at S&P Global.
- Health & Wellness: Health care coverage designed for the mind and body.
- Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.
- Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
- Family Friendly Perks: It's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
- Beyond the Basics: From retail discounts to referral incentive awards, small perks can make a big difference.
For more information on benefits by country visit https://spgbenefits.com/benefit-summaries
Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.
Recruitment Fraud Alert: If you receive an email from a spglobalind.com domain or any other regionally based domains, it is a scam and should be reported to reportfraud@spglobal.com. S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, pre-employment training or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here.
----
Equal Opportunity Employer: S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person.
US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law.
Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf
----
203 - Entry Professional (EEO Job Group) (inactive), 20 - Professional (EEO-2 Job Categories-United States of America), IFTECH203 - Entry Professional (EEO Job Group)
Posted 3 weeks ago
10.0 - 15.0 years
35 - 40 Lacs
Hyderabad
Work from Office
Role Purpose: The purpose of the role is to define and develop the Enterprise Data Structure, along with Data Warehouse, Master Data, Integration, and transaction processing, while maintaining and strengthening the modelling standards and business information.
Do:
1. Define and develop a Data Architecture that aids the organization and clients in new/existing deals
a. Partner with business leadership (adopting the rationalization of the data value chain) to provide strategic, information-based recommendations to maximize the value of data and information assets, and to protect the organization from disruptions while also embracing innovation
b. Assess the benefits and risks of data by using tools such as business capability models to create a data-centric view that quickly visualizes what data matters most to the organization, based on the defined business strategy
c. Create data strategy and road maps for the Reference Data Architecture as required by the clients
d. Engage all stakeholders to implement data governance models and ensure that the implementation is done based on every change request
e. Ensure that the data storage and database technologies are supported by the data management and infrastructure of the enterprise
f. Develop, communicate, support, and monitor compliance with Data Modelling standards
g. Oversee and monitor all frameworks to manage data across the organization
h. Provide insights on database storage and platforms for ease of use and the least manual work
i. Collaborate with vendors to ensure integrity, objectives, and system configuration
j. Collaborate with functional and technical teams and clients to understand the implications of data architecture and maximize the value of information across the organization
k. Present the data repository, objects, and source systems along with data scenarios for front-end and back-end usage
l. Define high-level data migration plans to transition the data from source to target system/application, addressing the gaps between the current and future state, typically in sync with IT budgeting or other capital planning processes
m. Maintain knowledge of all the data service provider platforms and ensure an end-to-end view
n. Oversee all the data standards/reference papers for proper governance
o. Promote, guard, and guide the organization towards common semantics and the proper use of metadata
p. Collect, aggregate, match, consolidate, quality-assure, persist, and distribute such data throughout the organization to ensure a common understanding, consistency, accuracy, and control
q. Provide solutions for RFPs received from clients and ensure overall implementation assurance
i. Develop a direction to manage the portfolio of all databases, including systems and shared infrastructure services, in order to better match business outcome objectives
ii. Analyse the technology environment, enterprise specifics, and client requirements to set a collaboration solution for big/small data
iii. Provide technical leadership for the implementation of custom solutions through thoughtful use of modern technology
iv. Define and understand current issues and problems and identify improvements
v. Evaluate and recommend solutions to integrate with the overall technology ecosystem, keeping consistency throughout
vi. Understand the root-cause problems in integrating business and product units
vii. Validate the solution/prototype from a technology, cost-structure, and customer-differentiation point of view
viii. Collaborate with sales and delivery leadership teams to identify future needs and requirements
ix. Track industry and application trends and relate these to planning current and future IT needs
2. Build an enterprise technology environment for data architecture management
a. Develop, maintain, and implement standard patterns for data layers, data stores, data hub & lake, and data management processes
b. Evaluate all the implemented systems to determine their viability in terms of cost effectiveness
c. Collect all the structural and non-structural data from different places and integrate it into one database form
d. Work through every stage of data processing: analysing, creating physical data model designs, solutions, and reports
e. Build the enterprise conceptual and logical data models for analytics, operational, and data mart structures in accordance with industry best practices
f. Implement the best security practices across all the databases based on the accessibility and technology
g. Demonstrate a strong understanding of activities within the primary discipline, such as Master Data Management (MDM), Metadata Management, and Data Governance (DG)
h. Demonstrate strong experience in conceptual, logical, and physical database architectures, design patterns, best practices, and programming techniques around relational data modelling and data integration
3. Enable delivery teams by providing optimal delivery solutions/frameworks
a. Build and maintain relationships with delivery and practice leadership teams and other key stakeholders to become a trusted advisor
b. Define database physical structure, functional capabilities, security, back-up, and recovery specifications
c. Develop and establish relevant technical, business process, and overall support metrics (KPI/SLA) to drive results
d. Monitor system capabilities and performance by performing tests and configurations
e. Integrate new solutions and troubleshoot previously occurred errors
f. Manage multiple projects and accurately report the status of all major assignments while adhering to all project management standards
g. Identify technical, process, and structural risks and prepare a risk mitigation plan for all the projects
h. Ensure quality assurance of all the architecture or design decisions and provide technical mitigation support to the delivery teams
i. Recommend tools for reuse and automation for improved productivity and reduced cycle times
j. Help the support and integration team achieve better efficiency, client experience, and ease of use by using AI methods
k. Develop trust and build effective working relationships through respectful, collaborative engagement across individual product teams
l. Ensure architecture principles and standards are consistently applied to all the projects
m. Ensure optimal client engagement
i. Support the pre-sales team while presenting the entire solution design and its principles to the client
ii. Negotiate, manage, and coordinate with the client teams to ensure all requirements are met
iii. Demonstrate thought leadership with strong technical capability in front of the client to win their confidence and act as a trusted advisor
Mandatory Skills: AI Application Integration. Experience: 10 Years.
Posted 3 weeks ago
8.0 - 10.0 years
4 - 8 Lacs
Kolkata
Work from Office
1. Be instrumental in understanding the requirements and design of the product/software
- Develop software solutions by studying information needs, systems flow, data usage, and work processes
- Investigate problem areas and follow the software development life cycle
- Facilitate root cause analysis of system issues and the problem statement
- Identify ideas to improve system performance and impact availability
- Analyze client requirements and convert requirements to feasible design
- Collaborate with functional teams or systems analysts who carry out the detailed investigation into software requirements
- Confer with project managers to obtain information on software capabilities
2. Perform coding and ensure optimal software/module development
- Determine operational feasibility by evaluating analysis, problem definition, requirements, software development, and proposed software
- Develop and automate processes for software validation by setting up and designing test cases/scenarios/usage cases, and executing these cases
- Modify software to fix errors, adapt it to new hardware, improve its performance, or upgrade interfaces
- Analyze information to recommend and plan the installation of new systems or modifications to an existing system
- Ensure that code is error-free, with no bugs or test failures
- Prepare reports on programming project specifications, activities, and status
- Ensure all the codes are raised as per the norm defined for the project/program/account, with clear descriptions and replication patterns
- Compile timely, comprehensive, and accurate documentation and reports as requested
- Coordinate with the team on daily project status and progress and document it
- Provide feedback on usability and serviceability, trace the result to quality risk, and report it to concerned stakeholders
3. Status reporting and customer focus on an ongoing basis with respect to the project and its execution
- Capture all the requirements and clarifications from the client for better-quality work
- Take feedback on a regular basis to ensure smooth and on-time delivery
- Participate in continuing education and training to remain current on best practices, learn new programming languages, and better assist other team members
- Consult with engineering staff to evaluate software-hardware interfaces and develop specifications and performance requirements
- Document and demonstrate solutions by developing documentation, flowcharts, layouts, diagrams, charts, code comments, and clear code
- Document necessary details and reports in a formal way for a proper understanding of the software, from client proposal to implementation
- Ensure good quality of interaction with the customer with respect to e-mail content, fault report tracking, voice calls, business etiquette, etc.
- Respond to customer requests in a timely manner, with no instances of complaints either internally or externally
Posted 3 weeks ago
7.0 - 12.0 years
13 - 18 Lacs
Bengaluru
Work from Office
The primary focus at EITSI is to develop the next generation LIMS (Lab Information Management system), Customer portals, e-commerce solutions, ERP/CRM system, Mobile Apps & other B2B platforms for various Eurofins Laboratories and businesses. Job TitleModule Lead - Power BI Full Stack Location: Bangalore, India Experience Required: 7-12 years Employment Type: Full-time : We are seeking an experienced Power BI Lead with solid hands-on experience in requirement gathering , documentation, designing, developing, and maintaining robust BI solutions. The ideal candidate should possess expertise in Power BI , along with deep knowledge in backend development, database management , and data modeling to support end-to-end BI project lifecycles. Key Responsibilities: Develop and maintain Power BI reports, dashboards, and interactive visualizations to meet business requirements. Collaborate with business stakeholders to understand data needs, translating them into technical solutions. Design, implement, and optimize SQL queries, stored procedures, and data pipelines for efficient data retrieval and transformation. Develop and maintain ETL processes to ensure data integrity and accuracy across various data sources. Work with APIs and other third-party integrations to gather and visualize external data within Power BI reports. Ensure the security, scalability, and performance of BI solutions by following best practices in data governance. Conduct data modeling and design complex, multi-source data structures to support reporting needs. Perform data validation and troubleshoot issues to ensure accurate reporting and data representation. Continuously work on optimizing Power BI solutions for better performance and user experience . Provide training and technical support to end-users on Power BI tools and features. Required Skills & Qualifications: At least 5 years of hands-on experience in Power BI development, including creating dashboards, visualizations, and reports. Proficiency in DAX and Power Query for data transformations. Strong understanding of SQL Server, T-SQL , and other relational databases. Experience with ETL processes and tools like SSIS, Azure Data Factory , or similar. Experience in data modeling and working with large datasets in a business intelligence environment. Hands-on experience in backend development with programming languages like Python, .NET , or JavaScript is a plus. Ability to work with various data sources (SQL, NoSQL, cloud-based sources). Familiarity with Power BI service , including publishing, scheduling, and managing reports. Understanding of cloud technologies like Azure is a plus. Strong analytical and problem-solving skills. Additional Requirements: Demonstrated ability to have successfully completed multiple, complex technical projects. Demonstrates a rational and organized approach to the tasks undertaken and an awareness of the need to achieve quality. Demonstrates high standards of professional behaviour in dealings with clients, colleagues and staff. Strong written communication skills. Is effective and persuasive in both written and oral communication. Experience with gathering end user requirements and writing technical documentation. Time management and multitasking skills to effectively meet deadlines under time-to-market pressure. May require occasional travel. Qualifications Bachelor"™s degree in Computer Science , Information Technology , or a related field. 
Power BI or relevant Microsoft certifications are preferred.
Additional Information: Prior experience in industrial settings, and especially with laboratory processes, is a plus.
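Since the role combines Power Query transformations with backend Python skills, here is a minimal sketch of a "Run Python script" step inside Power Query; Power BI Desktop passes the current query's rows in as a pandas DataFrame named `dataset`, and the column names below (OrderDate, Region, Revenue) are hypothetical.

```python
# Minimal sketch of a Python transformation step in Power Query (Power BI Desktop).
# The host provides the query's rows as a pandas DataFrame called `dataset`;
# any DataFrame left in scope is returned to Power Query as a table.
import pandas as pd

df = dataset.copy()

# Basic cleaning: parse dates, drop rows with missing dates or revenue.
df["OrderDate"] = pd.to_datetime(df["OrderDate"], errors="coerce")
df = df.dropna(subset=["OrderDate", "Revenue"])

# Derive a monthly aggregate that downstream visuals or DAX measures can build on.
monthly = (
    df.assign(Month=df["OrderDate"].dt.to_period("M").astype(str))
      .groupby(["Region", "Month"], as_index=False)["Revenue"]
      .sum()
)
```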
Posted 3 weeks ago
3.0 - 5.0 years
5 - 7 Lacs
Mumbai
Work from Office
5+ years of professional experience with SQL, ETL, data modeling, and at least one programming language (e.g., Python, C++, C#, Scala, etc.)
3+ years of professional experience with enterprise domains like HR, Finance, Supply Chain
4+ years of professional experience with more than one SQL/relational database, including expertise in Presto, Spark, and MySQL
Professional experience designing and implementing real-time pipelines (Apache Kafka or similar technologies)
5+ years of professional experience in custom ETL design, implementation, and maintenance
3+ years of professional experience with data modeling, including expertise in data warehouse design and dimensional modeling
5+ years of professional experience working with cloud or on-premises Big Data/MPP analytics platforms (Teradata, AWS Redshift, Google BigQuery, Azure Synapse Analytics, or similar)
Experience with data quality and validation (using Apache Airflow)
Experience with anomaly/outlier detection
Experience with Data Science workflows (Jupyter Notebooks, Bento, or similar tools)
Experience with Airflow or similar workflow management systems
Experience querying massive datasets using Spark, Presto, Hive, or similar (see the PySpark sketch after this listing)
Experience building systems integrations, tooling interfaces, and implementing integrations for ERP systems (Oracle, SAP, Salesforce, etc.)
Experience in data visualization using Power BI and Tableau
Proficiency in the Python programming language and Python libraries, with a focus on data engineering and data science applications
Professional fluency in English required
Mandatory Skills: Data Analysis. Experience: 3-5 Years.
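As a rough illustration of the Spark/Presto/Hive requirement above, here is a minimal PySpark sketch; the S3 path, table layout, and column names are hypothetical, chosen to echo the HR domain mentioned in the posting.

```python
# Minimal PySpark sketch for querying a large partitioned dataset (hypothetical names).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("hr_attrition_summary").getOrCreate()

# Read a partitioned Parquet dataset and prune to the date range we need.
events = (
    spark.read.parquet("s3://example-bucket/hr/events/")   # hypothetical path
         .filter(F.col("event_date") >= "2024-01-01")
)

# Aggregate termination events per department per month.
summary = (
    events.filter(F.col("event_type") == "termination")
          .groupBy("department", F.date_format("event_date", "yyyy-MM").alias("month"))
          .agg(F.countDistinct("employee_id").alias("terminations"))
          .orderBy("department", "month")
)

summary.show(20, truncate=False)
spark.stop()
```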
Posted 3 weeks ago
5.0 - 8.0 years
7 - 10 Lacs
Hyderabad
Work from Office
Long Description
Bachelor's Degree preferred, or equivalent combination of education, training, and experience.
5+ years of professional experience with SQL, ETL, data modeling, and at least one programming language (e.g., Python, C++, C#, Scala, etc.)
3+ years of professional experience with enterprise domains like HR, Finance, Supply Chain
6+ years of professional experience with more than one SQL/relational database, including expertise in Presto, Spark, and MySQL
Professional experience designing and implementing real-time pipelines (Apache Kafka or similar technologies)
5+ years of professional experience in custom ETL design, implementation, and maintenance
3+ years of professional experience with data modeling, including expertise in data warehouse design and dimensional modeling
5+ years of professional experience working with cloud or on-premises Big Data/MPP analytics platforms (Teradata, AWS Redshift, Google BigQuery, Azure Synapse Analytics, or similar)
Experience with data quality and validation (using Apache Airflow; a minimal Airflow sketch follows this listing)
Experience with anomaly/outlier detection
Experience with Data Science workflows (Jupyter Notebooks, Bento, or similar tools)
Experience with Airflow or similar workflow management systems
Experience querying massive datasets using Spark, Presto, Hive, or similar
Experience building systems integrations, tooling interfaces, and implementing integrations for ERP systems (Oracle, SAP, Salesforce, etc.)
Experience in data visualization using Power BI and Tableau
Proficiency in the Python programming language and Python libraries, with a focus on data engineering and data science applications
Professional fluency in English required
Mandatory Skills: Data Analysis. Experience: 5-8 Years.
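To illustrate the data quality and validation requirement above, here is a minimal sketch assuming the Apache Airflow 2.x TaskFlow API; the DAG name, check, and threshold are hypothetical, and a real pipeline would query the warehouse rather than return a hard-coded count.

```python
# Minimal Airflow sketch of a scheduled data-quality check (hypothetical names and thresholds).
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False, tags=["data-quality"])
def warehouse_quality_checks():

    @task
    def count_null_keys() -> int:
        # In a real pipeline this would query the warehouse (e.g. via a SQL hook);
        # a constant keeps the sketch self-contained.
        return 0

    @task
    def assert_within_threshold(null_count: int, max_allowed: int = 0) -> None:
        if null_count > max_allowed:
            raise ValueError(f"{null_count} null business keys found (max {max_allowed})")

    assert_within_threshold(count_null_keys())


warehouse_quality_checks()
```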
Posted 3 weeks ago
3.0 - 8.0 years
8 - 12 Lacs
Bengaluru
Work from Office
Description:
Degree and Qualification: BE/B.Tech or ME/M.Tech in CSE/IT, Statistics, or a related field. A Master's degree in data science, AI, or a related field is preferred.
Number of Years of Experience as a Data Analyst / Scientist: 3-8 years
Language Skills: Good communication skills in English; proficiency in German is an added advantage.
Domain Knowledge: Strong understanding of supply chain and supplier performance evaluation processes. Familiarity with procurement, supplier management, inbound processes, and logistics concepts like goods receipt, delivery note, and part numbers. Basic knowledge of plant logistics and operational efficiency.
Technical Skills: Python, PySpark, SQL, Power BI (advanced level), Databricks, Azure/AWS, machine learning, data visualization, TensorFlow, PyTorch, and deploying ML models in production.
Digital Expertise:
1. Power BI Desktop (advanced):
Advanced knowledge of building dashboards, creating data models, and writing DAX expressions.
Ability to develop custom visualizations in Power BI using Python scripts or other suitable methods to create charts and data representations not supported natively by Power BI.
Ability to write Python scripts to process and transform data within Power Query for advanced analytics and visualization scenarios.
Strong design and Power BI UI/UX skills to ensure Power BI dashboards align with business objectives and effectively tell a data-driven story.
Power BI Service (advanced): experience in publishing, refreshing, and managing reports in Power BI Service, including RLS and data gateways.
2. Machine Learning & AI:
Experience in building predictive models using machine learning algorithms such as regression, classification, clustering, and anomaly detection (see the sketch after this listing).
Familiarity with AI concepts like neural networks, NLP, or reinforcement learning.
3. Data Wrangling & Analysis:
Proficiency in Python and popular data libraries like Pandas, NumPy, and Scikit-learn.
Experience in PySpark for distributed data processing and large-scale data transformation.
Strong SQL skills for querying relational databases, optimizing queries, and handling complex datasets.
4. Communication & Stakeholder Management:
Ability to effectively communicate technical insights to non-technical stakeholders.
Strong customer-facing communication skills and experience collaborating with cross-functional teams.
Behavioral / Personal Skills: Willingness to learn and apply new skills. High adaptability and readiness to handle unstructured tasks. Strong analytical mindset, with a focus on problem-solving and result-oriented thinking. Team player with excellent communication and interpersonal skills.
Job Location: Bengaluru
Works in the area of Software Engineering, which encompasses the development, maintenance, and optimization of software solutions/applications.
1. Applies scientific methods to analyse and solve software engineering problems.
2. He/she is responsible for the development and application of software engineering practice and knowledge, in research, design, development, and maintenance.
3. His/her work requires the exercise of original thought and judgement and the ability to supervise the technical and administrative work of other software engineers.
4. The software engineer builds skills and expertise of his/her software engineering discipline to reach standard software engineer skills expectations for the applicable role, as defined in Professional Communities.
5. The software engineer collaborates and acts as a team player with other software engineers and stakeholders.
Grade Specific: Has more than a year of relevant work experience. Solid understanding of programming concepts, software design, and software development principles. Consistently works to direction with minimal supervision, producing accurate and reliable results. Individuals are expected to be able to work on a range of tasks and problems, demonstrating their ability to apply their skills and knowledge. Organises own time to deliver against tasks set by others with a mid-term horizon. Works co-operatively with others to achieve team goals, has a direct and positive impact on project performance, and makes decisions based on their understanding of the situation, not just the rules.
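As a rough illustration of the anomaly detection requirement in the Machine Learning & AI item above, here is a minimal sketch using scikit-learn's IsolationForest; the delivery-performance features and synthetic data are hypothetical stand-ins for goods-receipt and delivery-note metrics.

```python
# Minimal sketch: flagging anomalous supplier deliveries with IsolationForest
# (hypothetical feature names and synthetic data).
import numpy as np
import pandas as pd
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)
deliveries = pd.DataFrame({
    "delay_days": rng.normal(1.0, 0.8, 500).clip(min=0),
    "quantity_deviation_pct": rng.normal(0.0, 2.0, 500),
    "defect_rate_pct": rng.normal(0.5, 0.3, 500).clip(min=0),
})

# Fit an unsupervised outlier detector; fit_predict returns -1 for outliers, 1 for inliers.
model = IsolationForest(contamination=0.02, random_state=42)
deliveries["is_anomaly"] = model.fit_predict(deliveries) == -1

print(deliveries["is_anomaly"].sum(), "deliveries flagged for review")
```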
Posted 3 weeks ago
4.0 - 9.0 years
4 - 8 Lacs
Gurugram
Work from Office
We're in search of a Java Developer who lives, eats and breathes programming to lend their support to our talented IT team. You'll be developing and designing high-performance, scalable, mission-critical technology application systems in line with our company mandate, using Java/Java EE throughout the program development cycle, from concept to testing and deployment. We also need you to analyze requirements, detail user assistance materials, provide technical documentation, analyze application functionality, and offer solutions to any problems that may arise throughout program development. Of course, we'll want you to be a team player, but you get bonus points if you're looking for a Java developer role that helps you grow and evolve your development skills. Skillset Required: Bachelor's Degree in Computer Science or an associated field such as Engineering; Master's degree a plus. Oracle Certified Associate (OCA), Oracle Certified Professional (OCP), Oracle Certified Expert (OCE) or Oracle Certified Master (OCM) certification levels are a plus. Expert knowledge of Java and J2EE, including classloading, transaction management and memory management. Experience in SQL, relational databases, HTML and ORM technology, such as JPA2 and Hibernate. Four years' experience in a Java Developer (or related) role, with one to three years of developing apps and other software. Experience in database management, computer architecture, and crafting statistical analysis. Experience working with web frameworks such as Spring Framework, JSF, GWT or Wicket. Experience with testing and deployment, with an attention to detail that supports the software development cycle. Excellent communication and organizational skills, with a drive to hit targets and solve problems along the way. Capable of working as part of a software development and IT team, and with little to no supervision as required.
Posted 4 weeks ago
8.0 - 13.0 years
8 - 12 Lacs
Noida, Gurugram, Uttar Pradesh
Work from Office
We're seeking a talented and highly motivated software engineer to help us develop a scalable, high-performance, cloud-based platform for large-scale data storage and processing. Solve interesting technical challenges in the areas of distributed high-performance computing for a highly available cloud environment. The candidate is hands-on and passionate about exploiting multiple languages and programming techniques across products, frameworks and API layers, using the right tool for the right job to deliver sustainable solutions. The candidate is willing to explore new tools and technologies to meet the product demands. This person will work closely with existing team members to develop a comprehensive Java/J2EE based product. The role requires tight collaboration with product managers and business analysts to develop the product according to the business schedule. General and deep experience with Core Java concepts and J2EE technologies is a must. Strong knowledge of relational databases is required, and AWS knowledge is a must. This position will suit candidates who enjoy both the technical and business aspects of developing software solutions to a schedule in an environment of high visibility and transparency around deliverables, business needs, and customer value. Key Responsibilities: Implementation of financial services software using enterprise Java, RDBMS and modern web technologies Work closely with product leads to understand development requirements and translate them to code deliverables for financial applications Quickly understand the system architecture and leverage design and development, taking ownership of assigned modules to drive projects to completion Independently execute proofs of concept to validate approaches; summarize and document results for stakeholder review Validate developed solutions to ensure that requirements are met and the results meet the business needs Establish and maintain continuous deployment methodologies, including working with SQA teams to enforce unit and automated testing Develop required tools to automate management of all facets of data operations Required Skills: Experience in Core Java/J2EE related product development. Excellent knowledge of RDBMS and proficiency in PL/SQL are must-haves. Knowledge of Spring/Hibernate/RESTful Web Services is a must. Knowledge of web technologies and JavaScript-based frameworks (Node.js, AngularJS etc.) is a plus. The right candidate would also demonstrate solid OO programming, including Object Oriented Design Patterns, and have strong opinions on best programming practices Experience with cloud technologies like AWS, container platforms, container orchestration platforms, ECS etc. Well versed with continuous integration and continuous delivery tools and techniques Experience on Oracle 11 or SQL Server Strong proficiency applying REST-based API frameworks to large-scale, distributed, high-traffic web services Experience in Agile SCRUM project management methodologies Prefer to work in a nimble and dynamic environment with strong emphasis on ownership and responsibility Ability and passion to pick up new technologies and stay on the leading edge of full-stack development Education and Experience: Master's or Bachelor's in Computer Science, Engineering or equivalent experience 8+ years of professional programming experience Skills Appreciated: Experience with the Capital Markets domain Full-stack experience is a plus AWS Cloud experience is desirable Experience in Agile SCRUM project methodology
Posted 4 weeks ago
3.0 - 8.0 years
5 - 10 Lacs
Pune
Work from Office
What You'll Do
Avalara is an AI-first company. We expect every engineer, manager, and leader to actively leverage AI to enhance productivity, quality, innovation, and customer value. AI is embedded in our workflows, decision-making, and products, and success at Avalara requires embracing AI as an essential capability, not an optional tool. Join a collaborative integration team that values clean architecture, reusable components, and operational excellence. You'll work on high-impact integrations that connect Avalara's business and product systems, ensuring automation at scale and supporting mission-critical operations. As part of the Integration Platform Service team within Business Technology, your work will directly support Avalara's growth and operational efficiency.
What Your Responsibilities Will Be
You'll develop, maintain, and troubleshoot Avalara's integration ecosystem using Boomi, handling everything from new features and enhancements to issue resolution. You'll build scalable, resilient integrations across platforms like Salesforce, NetSuite, Workday, and Marketo using the best tools and techniques. Collaborating with stakeholders from various teams, you'll translate business needs into elegant integration solutions. You'll also contribute to Agile ceremonies as part of our scrum-based development process. You will report to the Manager, Business Integrations.
What You'll Need to be Successful
3+ years of hands-on experience developing integrations with Boomi Proven ability to integrate cloud-based business systems such as Salesforce and NetSuite Solid understanding of REST APIs and API lifecycle management Proficiency in a programming language such as JavaScript, Java, or Python Strong knowledge of SQL and relational database best practices Experience with cloud infrastructure platforms like AWS or Google Cloud A passion for learning and exploring new technologies, including AI-driven tools
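To ground the REST API and programming-language requirements above, here is a minimal Python sketch of the kind of system-to-system sync a Boomi process would model visually; the endpoints, token variable, and field names are hypothetical, not Avalara's or any vendor's actual API.

```python
# Minimal sketch of a REST-based sync between two systems (hypothetical endpoints and fields):
# authenticated GET from a source system, light transformation, POST to a target system.
import os

import requests

SOURCE_URL = "https://source.example.com/api/v1/accounts"    # hypothetical
TARGET_URL = "https://target.example.com/api/v1/customers"   # hypothetical
HEADERS = {"Authorization": f"Bearer {os.environ.get('SYNC_TOKEN', '')}"}

resp = requests.get(SOURCE_URL, headers=HEADERS, timeout=30)
resp.raise_for_status()

for account in resp.json().get("items", []):
    payload = {
        "external_id": account["id"],            # lets the target upsert idempotently
        "name": account["name"],
        "billing_country": account.get("country", "Unknown"),
    }
    requests.post(TARGET_URL, json=payload, headers=HEADERS, timeout=30).raise_for_status()
```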
Posted 4 weeks ago
8.0 - 11.0 years
16 - 20 Lacs
Hyderabad
Remote
US Shift (Night Shift). 8+ yrs in Data Modeling, 3+ yrs in ER Studio (ERwin not preferred), strong in relational & dimensional modeling, normalization. HR & EPM experience is a plus. Skilled in metadata, data dictionaries, documentation, communication.
Posted 4 weeks ago
21.0 - 31.0 years
32 - 42 Lacs
Bengaluru
Work from Office
What we’re looking for
We’re looking for a Senior Software Engineer to be part of our Integrations & Connect engineering team in India. We are looking for a talented and experienced engineer who has a passion for solving challenging technical problems. We work closely with our product owners and our customers, iterating quickly to ensure we build the best solution for our users. We welcome new ideas and fresh perspectives which can help grow our team and our product. We are looking for someone who has a passion for solving complex and interesting problems, and delivering versatile full-stack functionality to support our next set of integrations to drive customer adoption and make the experience more meaningful for our customers.
What you’ll be working on
Building and maintaining our enterprise integration services and capabilities across multiple data regions Top-to-bottom ownership of new features, including crafting technical specs, writing readable and extendible code, and keeping tabs on post-release metrics Building observable systems that track important metrics and automatically notify when something is off Participating in code reviews to validate best practices and logical designs Investigating and addressing issues with performance, scalability, and maintainability in both production and development environments Mentoring and guiding engineers on best practices
We’d love to hear from people with
8+ years of experience with backend web application development and integrations Experience with Python and frameworks like FastAPI, Flask, Pyramid (see the sketch after this listing) Expertise in designing and building world-class services and APIs Able to write efficient SQL queries and design schemas for relational databases (MSSQL or MySQL experience is a plus) Culture of code reviews, writing tech specs, and collaborating closely with other people Ability to work in agile environments with frequent deployments Practices automated testing, believes in and enforces good code quality and best engineering practices Excellent communication skills and the ability to work with both co-located and remote engineers and cross-functional partners
SurveyMonkey believes in-person collaboration is valuable for building relationships, fostering community, and enhancing our speed and execution in problem-solving and decision-making. As such, this opportunity is hybrid and requires you to work from the SurveyMonkey office in Bengaluru 3 days per week. #LI - Hybrid
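To illustrate the Python/FastAPI and API-design requirements above, here is a minimal sketch of a small service; the route, model, and in-memory store are hypothetical examples, not SurveyMonkey's actual API.

```python
# Minimal FastAPI sketch of a small integration-style service (hypothetical route and model).
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="integrations-demo")


class Connection(BaseModel):
    name: str
    target_system: str
    enabled: bool = True


# Stands in for a relational store to keep the sketch self-contained.
_CONNECTIONS: dict[int, Connection] = {}


@app.post("/connections/{connection_id}", status_code=201)
def create_connection(connection_id: int, connection: Connection) -> Connection:
    if connection_id in _CONNECTIONS:
        raise HTTPException(status_code=409, detail="connection already exists")
    _CONNECTIONS[connection_id] = connection
    return connection


@app.get("/connections/{connection_id}")
def get_connection(connection_id: int) -> Connection:
    if connection_id not in _CONNECTIONS:
        raise HTTPException(status_code=404, detail="connection not found")
    return _CONNECTIONS[connection_id]
```

Run locally with, for example, `uvicorn app:app --reload`, assuming the file is named app.py.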
Posted 4 weeks ago
10.0 - 14.0 years
0 Lacs
Noida, Uttar Pradesh
On-site
As a Senior ETL & Data Streaming Engineer at DataFlow Group, you will have the opportunity to utilize your extensive expertise in designing, developing, and maintaining robust data pipelines. With over 10 years of experience in the field, you will play a pivotal role in ensuring the scalability, fault-tolerance, and performance of our ETL processes. Your responsibilities will include architecting and building both batch and real-time data streaming solutions using technologies like Talend, Informatica, Apache Kafka, or AWS Kinesis. You will collaborate closely with data architects, data scientists, and business stakeholders to translate data requirements into efficient pipeline solutions and ensure data quality, integrity, and security across all storage solutions. In addition to monitoring, troubleshooting, and optimizing existing data pipelines, you will also be responsible for developing and maintaining comprehensive documentation for all ETL and streaming processes. Your role will involve implementing data governance policies and best practices within the Data Lake and Data Warehouse environments, as well as mentoring junior engineers to foster a culture of technical excellence and continuous improvement. To excel in this role, you should have a strong background in data engineering, with a focus on ETL, ELT, and data pipeline development. Your deep expertise in ETL tools, data streaming technologies, and AWS data services will be essential for success. Proficiency in SQL and at least one scripting language for data manipulation, along with strong database skills, will also be valuable assets in this position. If you are a proactive problem-solver with excellent analytical skills and strong communication abilities, this role offers you the opportunity to stay abreast of emerging technologies and industry best practices in data engineering, ETL, and streaming. Join us at DataFlow Group and be part of a team dedicated to making informed, cost-effective decisions through cutting-edge data solutions.
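As a rough illustration of the real-time streaming work described above, here is a minimal Python consumer sketch using the kafka-python client; the topic, group id, broker address, and event fields are hypothetical, and a production pipeline would batch records into the lake or warehouse rather than print them.

```python
# Minimal sketch of a real-time streaming consumer (hypothetical topic and fields),
# using the kafka-python client.
import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "application-events",                      # hypothetical topic
    bootstrap_servers=["localhost:9092"],
    group_id="etl-streaming-demo",
    auto_offset_reset="earliest",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)

for message in consumer:
    event = message.value
    # In a real pipeline this is where records would be validated, batched,
    # and landed in S3 or the warehouse; printing keeps the sketch self-contained.
    print(event.get("event_type"), event.get("occurred_at"))
```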
Posted 4 weeks ago
3.0 - 5.0 years
7 - 11 Lacs
Gurugram
Work from Office
Job Summary
Synechron is seeking a detail-oriented Data Analyst to leverage advanced data analysis, visualization, and insights to support our business objectives. The ideal candidate will have a strong background in creating interactive dashboards, performing complex data manipulations using SQL and Python, and automating workflows to drive efficiency. Familiarity with cloud platforms such as AWS is a plus, enabling optimization of data storage and processing solutions. This role will enable data-driven decision-making across teams, contributing to strategic growth and operational excellence.
Software Requirements
Required: PowerBI (or equivalent visualization tools like Streamlit, Dash) SQL (for data extraction, manipulation, and querying) Python (for scripting, automation, and advanced analysis) Data management tools compatible with cloud platforms (e.g., AWS S3, Redshift, or similar)
Preferred: Cloud platform familiarity, especially AWS services related to data storage and processing Knowledge of other visualization platforms (Tableau, Looker) Familiarity with source control systems (e.g., Git)
Overall Responsibilities
Develop, redesign, and maintain interactive dashboards and visualization tools to provide actionable insights. Perform complex data analysis, transformations, and validation using SQL and Python. Automate data workflows, reporting, and visualizations to streamline processes. Collaborate with business teams to understand data needs and translate them into effective visual and analytical solutions. Support data extraction, cleaning, and validation from various sources, ensuring data accuracy. Maintain and enhance understanding of cloud environments, especially AWS, to optimize data storage, processing pipelines, and scalability. Document technical procedures and contribute to best practices for data management and reporting.
Performance Outcomes: Timely, accurate, and insightful dashboards and reports. Increased automation reducing manual effort. Clear communication of insights and data-driven recommendations to stakeholders.
Technical Skills (By Category)
Programming Languages: Essential: SQL, Python. Preferred: R, additional scripting languages
Databases/Data Management: Essential: Relational databases (SQL Server, MySQL, Oracle). Preferred: NoSQL databases like MongoDB, cloud data warehouses (AWS Redshift, Snowflake)
Cloud Technologies: Essential: Basic understanding of AWS cloud services (S3, EC2, RDS). Preferred: Experience with cloud-native data solutions and deployment
Frameworks and Libraries: Python: Pandas, NumPy, Matplotlib, Seaborn, Plotly, Streamlit, Dash. Visualization: PowerBI, Tableau (preferred)
Development Tools and Methodologies: Version control: Git. Automation tools for workflows and reporting. Familiarity with Agile methodologies
Security Protocols: Awareness of data security best practices and compliance standards in cloud environments
Experience Requirements
3-5 years of experience in data analysis, visualization, or related data roles. Proven ability to deliver insightful dashboards, reports, and analysis. Experience working across teams and communicating complex insights clearly. Knowledge of cloud environments like AWS or other cloud providers is desirable. Experience in a business environment, not necessarily as a full-time developer, but as an analytical influencer.
Day-to-Day Activities
Collaborate with stakeholders to gather requirements and define data visualization strategies.
Design and maintain dashboards using PowerBI, Streamlit, Dash, or similar tools (see the sketch after this listing). Extract, transform, and analyze data using SQL and Python scripts. Automate recurring workflows and report generation to improve operational efficiencies. Troubleshoot data issues and derive insights to support decision-making. Monitor and optimize cloud data storage and processing pipelines. Present findings to business units, translating technical outputs into actionable recommendations.
Qualifications
Bachelor's degree in Computer Science, Data Science, Statistics, or a related field. Master's degree is a plus. Relevant certifications (e.g., PowerBI, AWS Data Analytics) are advantageous. Demonstrated experience with data visualization and scripting tools. Continuous learning mindset to stay updated on new data analysis trends and cloud innovations.
Professional Competencies
Strong analytical and problem-solving skills. Effective communication, with the ability to explain complex insights clearly. Collaborative team player with stakeholder management skills. Adaptability to rapidly changing data or project environments. Innovative mindset to suggest and implement data-driven solutions. Organized, self-motivated, and capable of managing multiple priorities efficiently.
SYNECHRON'S DIVERSITY & INCLUSION STATEMENT
Diversity & Inclusion are fundamental to our culture, and Synechron is proud to be an equal opportunity workplace and an affirmative action employer. Our Diversity, Equity, and Inclusion (DEI) initiative Same Difference is committed to fostering an inclusive culture promoting equality, diversity and an environment that is respectful to all. We strongly believe that a diverse workforce helps build stronger, successful businesses as a global company. We encourage applicants from across diverse backgrounds, race, ethnicities, religion, age, marital status, gender, sexual orientations, or disabilities to apply. We empower our global workforce by offering flexible workplace arrangements, mentoring, internal mobility, learning and development programs, and more.
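The dashboard and automation responsibilities above can be sketched with a small Streamlit app backed by a SQL extract; the connection string, table, and column names are hypothetical, chosen only to show the shape of the workflow.

```python
# Minimal sketch of an interactive Streamlit dashboard over a SQL extract
# (hypothetical connection string, table, and columns).
import pandas as pd
import sqlalchemy
import streamlit as st

engine = sqlalchemy.create_engine("postgresql://user:pass@host:5432/analytics")  # hypothetical


@st.cache_data(ttl=3600)
def load_orders() -> pd.DataFrame:
    # Pull a modest extract; heavier aggregation would normally stay in SQL.
    query = "SELECT order_date, region, revenue FROM sales.orders"
    return pd.read_sql(query, engine, parse_dates=["order_date"])


df = load_orders()

st.title("Revenue overview")
region = st.selectbox("Region", sorted(df["region"].unique()))
st.line_chart(df[df["region"] == region].set_index("order_date")["revenue"])
```

Run with `streamlit run dashboard.py`, assuming the file is saved as dashboard.py and the database is reachable.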
Posted 4 weeks ago