9.0 - 12.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description: About Us

At Bank of America, we are guided by a common purpose to help make financial lives better through the power of every connection. Responsible Growth is how we run our company and how we deliver for our clients, teammates, communities and shareholders every day. One of the keys to driving Responsible Growth is being a great place to work for our teammates around the world. We're devoted to being a diverse and inclusive workplace for everyone. We hire individuals with a broad range of backgrounds and experiences and invest heavily in our teammates and their families by offering competitive benefits to support their physical, emotional, and financial well-being. Bank of America believes both in the importance of working together and offering flexibility to our employees. We use a multi-faceted approach for flexibility, depending on the various roles in our organization. Working at Bank of America will give you a great career with opportunities to learn, grow and make an impact, along with the power to make a difference. Join us!

Global Business Services

Global Business Services delivers Technology and Operations capabilities to Lines of Business and Staff Support Functions of Bank of America through a centrally managed, globally integrated delivery model and globally resilient operations. Global Business Services is recognized for flawless execution, sound risk management, operational resiliency, operational excellence and innovation. In India, we are present in five locations and operate as BA Continuum India Private Limited (BACI), a non-banking subsidiary of Bank of America Corporation and the operating company for India operations of Global Business Services.

Process Overview

As part of Global Risk Analytics, Enterprise Risk Analytics (ERA) is responsible for the development of cross-business holistic analytical models and tools. Team responsibilities include:

Financed Emissions: responsible for supporting the calculation of asset-level balance sheet financed emissions, which are integral to the Bank's goal of achieving net-zero greenhouse gas emissions by 2050.
Financial Crimes Modelling & Analytics: responsible for enterprise-wide financial crimes and compliance surveillance model development and ongoing monitoring across all lines of business globally.
Operational Risk: responsible for operational risk loss forecasting and capital model development for CCAR/stress testing and regulatory capital reporting/economic capital measurement purposes.
Business Transformations: a central team of project managers and quantitative software engineers partnering with coverage-area ERA teams to onboard ERA production processes onto GCP/production platforms and to identify risks/gaps in ERA processes that can be addressed with well-designed and controlled software solutions.
Trade Surveillance Analytics: responsible for modelling and analytics supporting trade surveillance activities within risk.
Advanced Analytics: responsible for driving research, development, and implementation of new enhanced risk metrics and providing quantitative support for loss forecasting and stress testing requirements, including process improvement and automation.

Job Description

The role will be responsible for independently conducting quantitative analytics and modeling projects.

Responsibilities

Perform model development proofs of concept, research model methodology, explore internal and external data sources, design model development data, and develop preliminary models
Conduct complex data analytics on modeling data; identify, explain and address data quality issues; apply data exclusions; perform data transformation; and prepare data for model development
Analyze portfolio definitions, define model boundaries, analyze model segmentation, develop Financed Emissions models for different asset classes, and analyze and benchmark model results
Work with the Financed Emissions Data Team and Climate Risk Tech on the production process for model development and implementation data, including supporting data sourcing efforts, providing data requirements, and performing data acceptance testing
Work with the Financed Emissions Production & Reporting Team on model implementation, model production run analysis, and result analysis and visualization
Work with the ERA Model Implementation team and GCP Tech on model implementation, including opining on implementation design, providing the implementation data model and requirements, and performing model implementation result testing
Work with Model Risk Management (MRM) on model reviews and obtain model approvals
Work with GEG (Global Environmental Group) and FLU (Front Line Unit) on model requirements gathering and analysis, Climate Risk target setting, disclosure, analysis and reporting

Requirements

Education: B.E./B.Tech/M.E./M.Tech
Certifications (if any): NA
Experience Range: 9 to 12 years

Foundational Skills
Advanced knowledge of SQL and Python
Advanced Excel, VS Code, LaTeX and Tableau skills
Experience in multiple data environments such as Oracle, Hadoop and Teradata
Knowledge of data architecture concepts, data models and ETL processes
Knowledge of climate risk and financial concepts and products
Experience extracting and combining data from multiple sources and aggregating data for model development
Experience conducting quantitative analysis, performing model-driven analytics and developing models
Experience documenting business requirements for data, models, implementation, etc.

Desired Skills
Basics of finance
Basics of climate risk

Work Timings: 11:30 AM to 8:30 PM
Job Location: Hyderabad, Chennai
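For illustration, here is a minimal sketch of the attribution-based financed-emissions calculation this team supports. It follows the common PCAF-style convention (bank's share of financing times borrower emissions); the EVIC-based attribution rule, all field names, and the figures are invented assumptions, not the Bank's confirmed methodology.

```python
# Minimal sketch of attribution-based financed emissions (PCAF-style).
# All field names, figures, and the EVIC-based attribution rule are
# illustrative assumptions, not Bank of America's actual methodology.
import pandas as pd

exposures = pd.DataFrame({
    "counterparty": ["A", "B"],
    "outstanding_usd": [50_000_000, 20_000_000],    # drawn balance-sheet amount
    "evic_usd": [1_000_000_000, 400_000_000],       # enterprise value incl. cash
    "borrower_emissions_tco2e": [120_000, 45_000],  # borrower's reported emissions
})

# Attribution factor = the bank's share of the borrower's total financing.
exposures["attribution"] = exposures["outstanding_usd"] / exposures["evic_usd"]
exposures["financed_emissions_tco2e"] = (
    exposures["attribution"] * exposures["borrower_emissions_tco2e"]
)

print(exposures[["counterparty", "financed_emissions_tco2e"]])
# Portfolio-level total that would feed net-zero target tracking:
print(exposures["financed_emissions_tco2e"].sum())
```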
Posted 1 month ago
0.0 - 2.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Citi, the leading global bank, has approximately 200 million customer accounts and does business in more than 160 countries and jurisdictions. Citi provides consumers, corporations, governments and institutions with a broad range of financial products and services, including consumer banking and credit, corporate and investment banking, securities brokerage, transaction services, and wealth management. Our core activities are safeguarding assets, lending money, making payments and accessing the capital markets on behalf of our clients. Citi's Mission and Value Proposition explains what we do and Citi Leadership Standards explain how we do it. Our mission is to serve as a trusted partner to our clients by responsibly providing financial services that enable growth and economic progress. We strive to earn and maintain our clients' and the public's trust by constantly adhering to the highest ethical standards and making a positive impact on the communities we serve. Our Leadership Standards are a common set of skills and expected behaviors that illustrate how our employees should work every day to be successful and strengthen our ability to execute against our strategic priorities.

Diversity is a key business imperative and a source of strength at Citi. We serve clients from every walk of life, every background and every origin. Our goal is to have our workforce reflect this same diversity at all levels. Citi has made it a priority to foster a culture where the best people want to work, where individuals are promoted based on merit, where we value and demand respect for others and where opportunities to develop are widely available to all.

The Operations MIS team focuses on creating reports, dashboards, and performance metrics to provide actionable insights for various business functions, including USPB WFM & Customer Service and Wealth Ops MIS. They are responsible for building and maintaining datamarts, migrating legacy BI tools to modern platforms like Tableau, and automating data refreshes for dashboards. Projects include tracking ATM availability and performance, managing service tickets, and upgrading software for uninterrupted service. They aim to empower business stakeholders with accurate and timely information for strategic decision-making. Their work also supports capacity planning and issue remediation efforts.

The Data/Information Mgt Analyst - C09 is a developing professional role. Applies specialty area knowledge in monitoring, assessing, analyzing and/or evaluating processes and data. Interprets data and makes recommendations. Researches and interprets information. Identifies inconsistencies in data or results, defines business issues and formulates recommendations on policies, procedures or practices. Integrates established disciplinary knowledge within own specialty area with a basic understanding of related industry practices. Good understanding of how the team interacts with others in accomplishing the objectives of the area. Develops working knowledge of industry practices and standards. Limited but direct impact on the business through the quality of the tasks/services provided. Impact of the job holder is restricted to own team.

In this role, you're expected to:
Gather operational data from various cross-functional stakeholders to examine past business performance
Identify data patterns and trends, and provide insights to enhance business decision-making capability in business planning, process improvement, solution assessment, etc.
Recommend actions for future developments and strategic business opportunities, as well as enhancements to operational policies
May be involved in exploratory data analysis, confirmatory data analysis and/or qualitative analysis
Translate data into consumer or customer behavioral insights to drive targeting and segmentation strategies, and communicate all findings clearly and effectively to business partners and senior leaders
Continuously improve processes and strategies by exploring and evaluating new data sources, tools, and capabilities
Work closely with internal and external business partners in building, implementing, tracking and improving decision strategies
Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients and assets, by driving compliance with applicable laws, rules and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct and business practices, and escalating, managing and reporting control issues with transparency.

As a successful candidate, you should ideally have the following skills and exposure:
Data Warehousing & BI Tools: Strong understanding of data warehousing concepts and ETL processes, and experience working with Business Intelligence platforms (e.g., Tableau, Power BI).
Reporting & Dashboard Development: Proficiency in developing reports, dashboards, and performance metrics using reporting tools and data visualization techniques. Ability to create clear and concise visualizations of key data.
Data Management & SQL: Expertise in data management principles, SQL programming for data extraction and manipulation, and database management systems.
Communication & Stakeholder Management: Ability to effectively communicate technical information to non-technical stakeholders and collaborate with business partners to understand reporting requirements.
Automation & Scripting: Experience with scripting languages (e.g., Python) and automation tools (e.g., SSIS) for automating report generation, data refreshes, and other routine tasks.

Education:
Master's degree in Information Technology / Information Systems / Computer Applications / Engineering from a premier institute, or BTech/B.E/MCA in Information Technology / Information Systems / Computer Applications

Experience: Established competency in one or more of the following:
0-2 years (for a master's degree) / 2-4 years (for a 4-year bachelor's degree) of relevant work experience in Data Management / MIS / Reporting / Data Analytics within the Banking / Financial Services / Analytics industry
Programming: SQL
Data Manipulation: PySpark, Python, SSIS
Visualization: Tableau / Power BI
Reporting: SSRS
Databases: MS SQL Server, Teradata
Understanding of systems and technology platforms
Strong analytical aptitude and logical reasoning ability
Strong communication skills
Good presentation skills

Working at Citi is far more than just a job. A career with us means joining a family of more than 230,000 dedicated people from around the globe. At Citi, you'll have the opportunity to grow your career, give back to your community and make a real impact.
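As a flavor of the automation this role describes, here is a hedged sketch of refreshing an ATM-availability metric feed for a dashboard. The DSN, database, table, and column names are assumptions for demonstration, not Citi's actual systems.

```python
# Illustrative sketch only: refresh a 30-day ATM-availability extract that a
# Tableau/Power BI dashboard reads. All connection details, tables, and
# columns are invented assumptions.
import pandas as pd
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=mis-db;DATABASE=ops;Trusted_Connection=yes;"
)

query = """
SELECT atm_id,
       CAST(SUM(uptime_minutes) AS FLOAT) / SUM(total_minutes) AS availability
FROM   dbo.atm_daily_status
WHERE  status_date >= DATEADD(day, -30, GETDATE())
GROUP  BY atm_id;
"""

availability = pd.read_sql(query, conn)

# Write the extract the dashboard picks up on its next scheduled refresh.
availability.to_csv("atm_availability_30d.csv", index=False)
conn.close()
```

A scheduler (SQL Agent, SSIS, or cron) would run a script like this so the dashboard never depends on a manual pull.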
Job Family Group: Decision Management
Job Family: Data/Information Management
Time Type: Full time

Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View Citi's EEO Policy Statement and the Know Your Rights poster.
Posted 1 month ago
6.0 years
5 - 9 Lacs
Hyderābād
On-site
Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

Primary Responsibilities:
Apply business standards, processes and requirements to set up automated processes for data transformation, loading, and normalization
Build, monitor and maintain highly secure digital pipelines for the transport of clinical data for DRN projects
Validate data relationships, mappings and definitions
Develop methodology to analyze and load data and to ensure data quality
Participate in internal and external client data implementation calls to discuss file data formats and content
Monitor and maintain extraction processes to ensure that data feeds remain current and complete
Monitor data flows for loading errors and incomplete or incompatible formats
Coordinate with team members and clients to ensure on-time delivery and receipt of data files
Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regard to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so

Required Qualifications:
College bachelor's degree, or equivalent work experience
6+ years of experience with Scala, Python, Spark or SQL
6+ months of SAS or SQL programming experience
Experience with complex SQL statements and familiarity with relational databases
Health care claims data experience
Unix scripting knowledge
General software development knowledge (.NET, Java, Oracle, Teradata, HTML, etc.)
Familiarity with processing large data sets
AWS or Azure cloud services exposure (AWS Glue, Azure Synapse, Azure Data Factory, Databricks)
Proficiency with Microsoft operating systems and Internet browsers
Proven ability to work independently
Proven solid verbal and written communication skills
Proven solid ability to multi-task and prioritize multiple projects at any given time
Proven solid analytical and problem-solving skills
Proven ability to work within a solid team structure
Willingness and ability to travel to meet with internal or external customers

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes.
We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.
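As a hedged sketch of the pipeline-monitoring work this role describes, the following validates an inbound clinical-claims feed before loading, routing incomplete rows to a rejection path. The file layout, column names, S3 paths, and rules are illustrative assumptions, not Optum's actual feed specification.

```python
# Hedged sketch: validate an inbound claims feed before loading.
# File paths, columns, and rules are invented for illustration.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("claims_intake_check").getOrCreate()

claims = spark.read.option("header", True).csv("s3://bucket/claims/2025-06/*.csv")

# Flag rows that would break downstream normalization: missing keys, bad dates.
checked = claims.withColumn(
    "load_error",
    F.when(F.col("member_id").isNull(), "missing member_id")
     .when(F.to_date("service_date", "yyyy-MM-dd").isNull(), "bad service_date")
     .otherwise(None),
)

good = checked.filter(F.col("load_error").isNull()).drop("load_error")
bad = checked.filter(F.col("load_error").isNotNull())

good.write.mode("append").parquet("s3://bucket/claims_validated/")
bad.write.mode("append").parquet("s3://bucket/claims_rejected/")  # for remediation
```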
Posted 1 month ago
2.0 years
1 - 8 Lacs
Hyderābād
On-site
About this role: Wells Fargo is seeking an Analytics Consultant.

In this role, you will:
Consult with business line and enterprise functions on less complex research
Use functional knowledge to assist in non-model quantitative tools that support strategic decision making
Perform analysis of findings and trends using statistical analysis and document the process
Present recommendations to increase revenue, reduce expense, and maximize operational efficiency, quality, and compliance
Identify and define business requirements and translate data and business needs into research and recommendations to improve efficiency
Participate in all group technology efforts including design and implementation of database structures, analytics software, storage, and processing
Develop customized reports and ad hoc analyses to make recommendations and provide guidance to less experienced staff
Understand compliance and risk management requirements for the supported area
Ensure adherence to data management or data governance regulations and policies
Participate in company initiatives or processes to assist in meeting risk and capital objectives and other strategic goals
Collaborate and consult with more experienced consultants and with partners in technology and other business groups

Required Qualifications:
2+ years of Analytics experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education

Desired Qualifications:
Experience in Analytics, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education
Excellent verbal, written, and interpersonal communication skills
Strong knowledge of Enterprise Risk programs and applicability of the risk management framework (3 Lines of Defense)
Experience identifying internal and external data sources from multiple sources across the business
Experience with SQL, Teradata, or SAS, and database management systems such as Teradata and MS SQL Server
Experience in risk (including compliance, financial crimes, operational, audit, legal, credit risk, market risk)
Experience in data visualization and business intelligence tools
Advanced Microsoft Office (Word, Excel, Outlook and PowerPoint) skills
Demonstrated strong analytical skills with high attention to detail and accuracy
Strong presentation skills and the ability to translate and present data in a manner that educates, enhances understanding, and influences decisions, with a bias for simplicity
Strong writing skills: proven ability to translate data sets and the conclusions drawn from analysis into business/executive format and language
Ability to support multiple projects with tight timelines
Metadata management, data lineage, data element mapping, and data documentation experience
Experience researching and resolving data problems and working with technology teams on remediation of data issues
Hands-on proficiency with Python, Power BI (Power Query, DAX, Power Apps), Tableau, or SAS
Knowledge of defect management tools like HP ALM
Knowledge of data governance

Job Expectations:
Ensure adherence to data management or data governance regulations and policies
Extract and analyze data from multiple technology systems/platforms and related data sources to identify factors that pose a risk to the firm
Consult with business line and enterprise functions on less complex research
Understand compliance and risk management requirements for sanctions compliance and data management
Perform analysis of findings and trends using statistical analysis and document the process
This role requires a solid background in reporting and in understanding and utilizing relational databases and data warehouses, and effectiveness in querying and reporting on large and complex data sets
Excel at telling stories with data, presenting information in visually compelling ways that appeal to executive audiences, and be well versed in the development and delivery of reporting solutions
Build easy-to-use visualizations and perform data analysis to generate meaningful business insights from complex datasets for global stakeholders
Test key reports and produce process documentation
Present recommendations to maximize operational efficiency, quality, and compliance
Identify and define business requirements and translate data and business needs into research and recommendations to improve efficiency
Develop customized reports and ad hoc analyses to make recommendations and provide guidance to less experienced staff

Posting End Date: 16 Jun 2025
*Job posting may come down early due to volume of applicants.

We Value Equal Opportunity

Wells Fargo is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other legally protected characteristic.

Employees support our focus on building strong customer relationships balanced with a strong risk-mitigating and compliance-driven culture which firmly establishes those disciplines as critical to the success of our customers and company. They are accountable for execution of all applicable risk programs (Credit, Market, Financial Crimes, Operational, Regulatory Compliance), which includes effectively following and adhering to applicable Wells Fargo policies and procedures, appropriately fulfilling risk and compliance obligations, timely and effective escalation and remediation of issues, and making sound risk decisions. There is emphasis on proactive monitoring, governance, risk identification and escalation, as well as making sound risk decisions commensurate with the business unit's risk appetite and all risk and compliance program requirements.

Candidates applying to job openings posted in Canada: Applications for employment are encouraged from all qualified candidates, including women, persons with disabilities, aboriginal peoples and visible minorities. Accommodation for applicants with disabilities is available upon request in connection with the recruitment process.

Applicants with Disabilities: To request a medical accommodation during the application or interview process, visit Disability Inclusion at Wells Fargo.

Drug and Alcohol Policy: Wells Fargo maintains a drug free workplace. Please see our Drug and Alcohol Policy to learn more.

Wells Fargo Recruitment and Hiring Requirements:
a. Third-Party recordings are prohibited unless authorized by Wells Fargo.
b. Wells Fargo requires you to directly represent your own experiences during the recruiting and hiring process.
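As one hedged illustration of the "extract and analyze data to identify risk factors" expectation, the sketch below pulls open data-quality exceptions from a Teradata source and pivots them into an executive-ready summary. The host, credentials, schema, and table names are assumptions, not Wells Fargo systems.

```python
# Illustrative sketch: summarize open data-quality exceptions by business unit.
# Host, credentials, and table names are invented assumptions.
import pandas as pd
import teradatasql

with teradatasql.connect(host="tdprod", user="analyst", password="***") as conn:
    issues = pd.read_sql(
        """
        SELECT business_unit, issue_type, COUNT(*) AS open_issues
        FROM   risk_db.data_quality_exceptions
        WHERE  status = 'OPEN'
        GROUP  BY business_unit, issue_type
        """,
        conn,
    )

# A simple pivot turns raw counts into the kind of table an executive reads.
summary = issues.pivot_table(
    index="business_unit", columns="issue_type",
    values="open_issues", fill_value=0,
)
print(summary)
```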
Posted 1 month ago
3.0 - 5.0 years
10 - 14 Lacs
Gurugram, Bengaluru, Mumbai (All Areas)
Hybrid
Role & responsibilities:
Design, develop, and maintain ETL workflows using Ab Initio
Manage and support critical data pipelines and data sets across complex, high-volume environments
Perform data analysis and troubleshoot issues across Teradata and Oracle data sources
Collaborate with DevOps for CI/CD pipeline integration using Jenkins, and manage deployments in Unix/Linux environments
Participate in Agile ceremonies including stand-ups, sprint planning, and roadmap discussions
Support cloud migration efforts, including potential adoption of Azure, Databricks, and PySpark-based solutions
Contribute to project documentation, metadata management (LDM, PDM), onboarding guides, and SOPs

Preferred candidate profile:
3 years of experience in data engineering, with proven expertise in ETL development and maintenance
Proficiency with Ab Initio tools (GDE, EME, Control Center)
Strong SQL skills, particularly with Oracle or Teradata
Solid experience with Unix/Linux systems and scripting
Familiarity with CI/CD pipelines using Jenkins or similar tools
Strong communication skills and ability to collaborate with cross-functional teams
Posted 1 month ago
2.0 - 4.0 years
8 - 12 Lacs
Mumbai
Work from Office
The SAS to Databricks Migration Developer will be responsible for migrating existing SAS code, data processes, and workflows to the Databricks platform. This role requires expertise in both SAS and Databricks, with a focus on converting SAS logic into scalable PySpark and Python code. The developer will design, implement, and optimize data pipelines, ensuring seamless integration and functionality within the Databricks environment. Collaboration with various teams is essential to understand data requirements and deliver solutions that meet business needs. An illustrative translation is sketched below.
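The following hedged sketch shows the kind of SAS-to-PySpark conversion this role performs. The SAS DATA step and all dataset, table, and column names are invented for illustration.

```python
# Hedged illustration of a SAS DATA step translated to PySpark on Databricks.
# Original SAS logic (invented example):
#   data out.high_value;
#     set in.txns;
#     where amount > 10000;
#     fee = amount * 0.01;
#   run;
from pyspark.sql import functions as F

txns = spark.read.table("in_db.txns")  # on Databricks, `spark` is pre-defined

high_value = (
    txns.where(F.col("amount") > 10000)             # SAS WHERE clause
        .withColumn("fee", F.col("amount") * 0.01)  # SAS derived column
)

high_value.write.mode("overwrite").saveAsTable("out_db.high_value")
```

The key shift is from SAS's implicit row-by-row loop to declarative, cluster-parallel DataFrame transformations.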
Posted 1 month ago
0 years
7 - 9 Lacs
Calcutta
On-site
Ready to shape the future of work?

At Genpact, we don't just adapt to change, we drive it. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's AI Gigafactory, our industry-first accelerator, is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, this is your moment.

Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.

Inviting applications for the role of Senior Principal Consultant - Banking SME!

Responsibilities

1. Domain Expertise & Requirements Gathering
Act as the Subject Matter Expert (SME) for banking products, services, and core operations (e.g., retail banking, lending, cards, payments, risk, compliance)
Understand current reporting and data usage in Teradata across key functions (Finance, Risk, Compliance, Treasury)
Collaborate with business teams to capture data requirements, KPI definitions, and use cases for cloud consumption
Act as the go-to expert on Indian banking processes, products, and regulations
Lead or contribute to solution design for digital banking platforms, core banking systems, or risk and compliance solutions
Liaise with product managers, developers, and business teams to ensure functional accuracy and feasibility
Conduct in-depth gap analysis and process mapping, and define business requirements
Provide insights on industry trends, regulatory changes, and customer expectations
Assist in responding to RFPs and proposals with domain-specific content
Mentor junior business analysts and support training initiatives
Engage with clients to gather requirements and present domain solutions

2. Source-to-Target Data Mapping
Define and validate data mappings from Teradata tables to GCP data models (e.g., BigQuery)
Ensure proper representation of key banking entities such as accounts, transactions, customers, products, and GLs
Support creation and validation of STTM (Source-to-Target Mapping) documents and transformation logic

3. Data Validation & Reconciliation
Participate in data validation strategy and run-throughs across multiple reconciliation cycles
Support and perform sample-based and logic-based data validation (e.g., balances, interest accruals, transactional totals)
Help establish reconciliation rules to compare GCP output against Teradata extracts or reports (see the sketch after this posting)

4. Testing & Sign-Off
Define test cases and support User Acceptance Testing (UAT) for migrated data
Review report outputs and regulatory extracts (e.g., BCBS, IFRS, GL reconciliation) for accuracy post-migration
Act as business validator during dry runs and final cutovers

5. Stakeholder Engagement
Liaise between technical migration teams and business stakeholders to clarify business rules and resolve data discrepancies
Conduct walkthroughs and data quality discussions with line-of-business leads and data governance teams

Qualifications we seek in you!

Minimum Qualifications / Skills
Bachelor's/Master's degree in Finance, Business, or a related field
Experience in the Indian banking sector, with a mix of operational and digital transformation exposure
Deep understanding of RBI regulations, KYC/AML, Basel II/III, and banking products (CASA, loans, trade finance, etc.)
Experience in core banking systems like Finacle, TCS BaNCS, Temenos, or similar
Familiarity with digital banking platforms, APIs, UPI, and open banking
Strong analytical, communication, and stakeholder management skills
Exposure to agile or hybrid project environments preferred

Why join Genpact?
Be a transformation leader: work at the cutting edge of AI, automation, and digital innovation
Make an impact: drive change for global enterprises and solve business challenges that matter
Accelerate your career: get hands-on experience, mentorship, and continuous learning opportunities
Work with the best: join 140,000+ bold thinkers and problem-solvers who push boundaries every day
Thrive in a values-driven culture: our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress

Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: Up. Let's build tomorrow together.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please do note that Genpact does not charge fees to process job applications and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.

Job: Senior Principal Consultant
Primary Location: India-Kolkata
Schedule: Full-time
Education Level: Master's / Equivalent
Job Posting: Jun 10, 2025, 4:02:24 AM
Unposting Date: Ongoing
Master Skills List: Digital
Job Category: Full Time
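The sketch referenced above illustrates one reconciliation rule comparing a legacy Teradata GL extract with the migrated BigQuery output. The file names, key column, and tolerance are assumptions for demonstration, not the actual migration's design.

```python
# Hedged sketch of a GL-balance reconciliation between a legacy Teradata
# extract and the migrated BigQuery output. File names, keys, and the
# tolerance are illustrative assumptions.
import pandas as pd

legacy = pd.read_csv("teradata_gl_balances.csv")    # columns: gl_account, balance
migrated = pd.read_csv("bigquery_gl_balances.csv")  # same layout

recon = legacy.merge(
    migrated, on="gl_account", suffixes=("_td", "_bq"),
    how="outer", indicator=True,
)

# Rule 1: every GL account must exist on both sides.
missing = recon[recon["_merge"] != "both"]

# Rule 2: balances must match within a small rounding tolerance.
recon["diff"] = (recon["balance_td"] - recon["balance_bq"]).abs()
breaks = recon[recon["diff"] > 0.01]

print(f"{len(missing)} accounts missing on one side, {len(breaks)} balance breaks")
```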
Posted 1 month ago
12.0 years
5 - 6 Lacs
Indore
On-site
Indore, Madhya Pradesh, India

Qualification:
BTech degree in computer science, engineering or a related field of study, or 12+ years of related work experience
7+ years of design and implementation experience with large-scale data-centric distributed applications
Professional experience architecting and operating cloud-based solutions, with a good understanding of core disciplines like compute, networking, storage, security, and databases
Good understanding of data engineering concepts like storage, governance, cataloging, data quality, and data modeling
Good understanding of various architecture patterns like data lake, data lakehouse, and data mesh
Good understanding of Data Warehousing concepts, with hands-on experience working with tools like Hive, Redshift, Snowflake, and Teradata
Experience migrating or transforming legacy customer solutions to the cloud
Experience working with services like AWS EMR, Glue, DMS, Kinesis, RDS, Redshift, DynamoDB, DocumentDB, SNS, SQS, Lambda, EKS, and DataZone
Thorough understanding of Big Data ecosystem technologies like Hadoop, Spark, Hive, and HBase, and other competent tools and technologies
Understanding of designing analytical solutions leveraging AWS cognitive services like Textract, Comprehend, and Rekognition, in combination with SageMaker, is good to have
Experience working with modern development workflows, such as git, continuous integration/continuous deployment pipelines, static code analysis tooling, and infrastructure-as-code
Experience with a programming or scripting language: Python/Java/Scala
AWS Professional/Specialty certification or relevant cloud expertise

Skills Required: AWS, Big Data, Spark, Technical Architecture

Role:
Drive innovation within the Data Engineering domain by designing reusable and reliable accelerators, blueprints, and libraries
Lead a technology team, inculcating an innovative mindset and enabling fast-paced deliveries
Adapt to new technologies, learn quickly, and manage high ambiguity
Work with business stakeholders and attend/drive various architectural, design, and status calls with multiple stakeholders
Exhibit good presentation skills, with a high degree of comfort speaking with executives, IT management, and developers
Drive technology/software sales or pre-sales consulting discussions
Ensure end-to-end ownership of all tasks assigned
Ensure high-quality software development with complete documentation and traceability
Fulfil organizational responsibilities (sharing knowledge and experience with other teams/groups)
Conduct technical trainings/sessions and write whitepapers/case studies/blogs

Experience: 10 to 18 years
Job Reference Number: 12895
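As a hedged sketch of the Glue-based patterns this role names, the job below reads a cataloged raw-zone table and writes curated Parquet for downstream warehouse consumption. The database, table, and S3 path names are invented assumptions.

```python
# Minimal sketch of an AWS Glue ETL job: catalog read -> curated S3 write.
# Database, table, and path names are illustrative assumptions.
from pyspark.context import SparkContext
from awsglue.context import GlueContext

glue_context = GlueContext(SparkContext.getOrCreate())

# Read through the Glue Data Catalog (the governance/cataloging layer).
orders = glue_context.create_dynamic_frame.from_catalog(
    database="raw_zone", table_name="orders"
)

# Write curated Parquet for downstream Redshift Spectrum / Athena consumption.
glue_context.write_dynamic_frame.from_options(
    frame=orders,
    connection_type="s3",
    connection_options={"path": "s3://lake/curated/orders/"},
    format="parquet",
)
```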
Posted 1 month ago
3.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
The Data Engineer will work closely with clients and the eCS Biometrics team to optimize the elluminate® platform for end-to-end solutions to aggregate, transform, access and report on clinical data throughout the life cycle of a clinical trial. This includes study design in elluminate®, collaboration on specifications, and configuration of the various modules, including Data Central, Clinical Data Analytics and Trial Operational Analytics, Risk-Based Quality Management (RBQM), Statistical Computing Environment (SCE) and Operational Insights. The Data Engineer will be involved in standard ETL activities as well as programming custom listings, visualizations and analytics tools using Mapper and Qlik. The position involves a high level of quality control as well as adherence to standard operating procedures and work instructions, and a constant drive towards automation and process improvement.

Key Tasks & Responsibilities
Design, develop, test, and deploy highly efficient code for supporting SDTM, custom reports and visualizations using tools like MS SQL, elluminate® Mapper and Qlik
Configure ETL processes to support the aggregation and standardization of clinical data from various sources, including EDC systems, SAS and central laboratory vendors
Work with Analytics developers, other team members and clients to review business requirements and translate them into database objects and visualizations
Manage multiple timelines and deliverables (for single or multiple clients) and manage client communications as assigned
Provide diagnostic support and fix defects as needed
Ensure compliance with eClinical Solutions/industry quality standards, regulations, guidelines, and procedures
Other duties as assigned

CANDIDATE'S PROFILE

Education & Experience
3+ years of professional experience preferred
Bachelor's degree or equivalent experience preferred
Experience with database/warehouse architecture, design and development preferred
Knowledge of various data platforms and warehouses, including SQL Server, DB2, Teradata, AWS, Azure, Snowflake, etc.
Understanding of cloud/hybrid data architecture concepts is a plus
Knowledge of clinical trial data is a plus: CDISC ODM, SDTM, or ADaM standards
Experience in the Pharmaceutical/Biotechnology/Life Science industry is a plus

Professional Skills
Critical thinking, problem solving and strong initiative
Communication and task management skills while working with technical and non-technical teams (both internal to eCS and clients)
Must be team oriented with strong collaboration, prioritization, and adaptability skills
Excellent knowledge of English; verbal and written communication skills, with the ability to interact with users and clients providing solutions
Excited to learn new tools and product modules and adapt to changing technology and requirements
Experience in the Life Sciences industry or a CRO / Clinical Trial regulated environment preferred

Technical Skills
Proficient in SQL, T-SQL, PL/SQL programming
Experience with Microsoft Office applications, specifically MS Project and MS Excel
Familiarity with multiple database platforms: Oracle, SQL Server, Teradata, DB2
Familiarity with data reporting tools: QlikSense, QlikView, Spotfire, Tableau, JReview, Business Objects, Cognos, MicroStrategy, IBM DataStage, Informatica, Spark or related
Familiarity with other languages and concepts: .NET, C#, Python, R, Java, HTML, SSRS, AWS, Azure, Spark, REST APIs, Big Data, ETL, Data Pipelines, Data Modelling, Data Analytics, BI, Data Warehouse, Data Lake or related
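For context on the SDTM mapping work this posting mentions, here is a hedged sketch of transforming a raw EDC demographics extract into CDISC SDTM DM variables. The source column names, study identifier, and code mappings are invented for illustration; the DM variable names (STUDYID, USUBJID, BRTHDTC, SEX, SITEID) follow the CDISC standard.

```python
# Hedged sketch: raw EDC demographics -> CDISC SDTM DM domain.
# Source columns and the study ID are invented assumptions.
import pandas as pd

raw_dm = pd.read_csv("edc_demographics.csv")  # subject, birth_dt, sex_code, site

dm = pd.DataFrame({
    "STUDYID": "ABC-101",                     # assumed study identifier
    "DOMAIN": "DM",
    "USUBJID": "ABC-101-" + raw_dm["subject"].astype(str),
    "BRTHDTC": pd.to_datetime(raw_dm["birth_dt"]).dt.strftime("%Y-%m-%d"),
    "SEX": raw_dm["sex_code"].map({1: "M", 2: "F"}).fillna("U"),
    "SITEID": raw_dm["site"],
})

dm.to_csv("sdtm_dm.csv", index=False)  # feeds downstream listings/visualizations
```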
Posted 1 month ago
1.0 - 3.0 years
3 - 5 Lacs
New Delhi, Chennai, Bengaluru
Hybrid
Your day at NTT DATA

We are seeking an experienced Data Engineer to join our team in delivering cutting-edge Generative AI (GenAI) solutions to clients. The successful candidate will be responsible for designing, developing, and deploying data pipelines and architectures that support the training, fine-tuning, and deployment of LLMs for various industries. This role requires strong technical expertise in data engineering, problem-solving skills, and the ability to work effectively with clients and internal teams.

What you'll be doing

Key Responsibilities:
Design, develop, and manage data pipelines and architectures to support GenAI model training, fine-tuning, and deployment
Data Ingestion and Integration: Develop data ingestion frameworks to collect data from various sources, transform it, and integrate it into a unified data platform for GenAI model training and deployment
GenAI Model Integration: Collaborate with data scientists to integrate GenAI models into production-ready applications, ensuring seamless model deployment, monitoring, and maintenance
Cloud Infrastructure Management: Design, implement, and manage cloud-based data infrastructure (e.g., AWS, GCP, Azure) to support large-scale GenAI workloads, ensuring cost-effectiveness, security, and compliance
Write scalable, readable, and maintainable code using object-oriented programming concepts in languages like Python, and utilize libraries like Hugging Face Transformers, PyTorch, or TensorFlow
Performance Optimization: Optimize data pipelines, GenAI model performance, and infrastructure for scalability, efficiency, and cost-effectiveness
Data Security and Compliance: Ensure data security, privacy, and compliance with regulatory requirements (e.g., GDPR, HIPAA) across data pipelines and GenAI applications
Client Collaboration: Collaborate with clients to understand their GenAI needs, design solutions, and deliver high-quality data engineering services
Innovation and R&D: Stay up to date with the latest GenAI trends, technologies, and innovations, applying research and development skills to improve data engineering services
Knowledge Sharing: Share knowledge, best practices, and expertise with team members, contributing to the growth and development of the team

Requirements:
Bachelor's degree in Computer Science, Engineering, or related fields (Master's recommended)
Experience with vector databases (e.g., Pinecone, Weaviate, Faiss, Annoy) for efficient similarity search and storage of dense vectors in GenAI applications (see the sketch after this list)
5+ years of experience in data engineering, with a strong emphasis on cloud environments (AWS, GCP, Azure, or Cloud Native platforms)
Proficiency in programming languages like SQL, Python, and PySpark
Strong data architecture, data modeling, and data governance skills
Experience with Big Data platforms (Hadoop, Databricks, Hive, Kafka, Apache Iceberg), data warehouses (Teradata, Snowflake, BigQuery), and lakehouses (Delta Lake, Apache Hudi)
Knowledge of DevOps practices, including Git workflows and CI/CD pipelines (Azure DevOps, Jenkins, GitHub Actions)
Experience with GenAI frameworks and tools (e.g., TensorFlow, PyTorch, Keras)

Nice to have:
Experience with containerization and orchestration tools like Docker and Kubernetes
Ability to integrate vector databases and implement similarity search techniques, with a focus on GraphRAG, is a plus
Familiarity with API gateway and service mesh architectures
Experience with low-latency/streaming, batch, and micro-batch processing
Familiarity with Linux-based operating systems and REST APIs
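The sketch referenced above shows the core similarity-search pattern behind RAG-style GenAI applications, using FAISS (one of the vector libraries the posting names). The embedding dimension and the random vectors are toy stand-ins for real document embeddings.

```python
# Hedged sketch of vector similarity search with FAISS, the pattern behind
# retrieval for GenAI apps. Dimension and vectors are toy assumptions.
import numpy as np
import faiss

dim = 384                            # e.g., a small sentence-embedding size
index = faiss.IndexFlatL2(dim)       # exact L2 search; no training needed

doc_vectors = np.random.rand(1000, dim).astype("float32")  # stand-in embeddings
index.add(doc_vectors)

query = np.random.rand(1, dim).astype("float32")
distances, ids = index.search(query, k=5)  # five nearest document chunks
print(ids[0])                              # indices to fetch as LLM context
```

In production, an approximate index (e.g., IVF or HNSW) would replace the flat index once the corpus grows beyond what exact search handles comfortably.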
Posted 1 month ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Our Company

Teradata empowers companies to achieve high-impact business outcomes through analytics. With a powerful combination of industry expertise and leading hybrid cloud technologies for data warehousing and big data analytics, Teradata unleashes the potential of great companies. Partnering with top companies around the world, Teradata helps improve customer experience, mitigate risk, drive product innovation, achieve operational excellence, transform finance, and optimize assets. Teradata is recognized by media and industry analysts as a future-focused company for its technological excellence, sustainability, ethics, and business value.

The Teradata culture isn't just about one kind of person. So many individuals make up who we are, making us that much more unique. It's what sets apart the dynamic, diverse and collaborative environment that is Teradata. But even as individuals, there's one thing that we all share: our united goal of making Teradata and our people the best we can be.

Who You'll Work With

Teradata Labs is where cutting-edge innovations in data management turn into business value. Our outstanding team of database architects and software engineers work together to understand and advance emerging technologies to produce the next wave of big data analytic solutions. Teradata Database is the core of Teradata Massively Parallel Processing (MPP) systems that run on-premises and in hybrid clouds to manage and optimize sophisticated workloads. The heart of Teradata Database is its cloud-based best-in-class query optimization engine. We work on query optimization techniques in database and analytics engines, machine learning algorithms, scalability and elasticity issues in the cloud, and many other exciting challenges related to performance, usability, accessibility and integration.

What You'll Do

The Database Query Optimization group at Teradata Labs has an opening for a Staff Software Engineer. In this role, you are expected to contribute to the design, development, and testing of new enhancements and advanced features for the Teradata Vantage Core Platform. You will:
Be responsible for all phases of the agile software development life cycle, from software design through customer support
Research and establish technical direction for complex feature development, and perform functional and performance problem analysis
As needed, perform competitive analysis of competing database management systems and data integration solutions, and provide recommendations on Teradata offering changes to close competitive gaps and enhance competitive advantages
Design, implement, validate, and test new database and novel query optimization features in an Agile manner, and perform functional and performance analysis of code defects and correction of the defects
Contribute to the delivery and continuous support of robust, resilient, and quality database products
Lead and establish technical direction for a group of software engineers during feature development
Help the feature manager with technical aspects of features and projects, including planning, tracking and providing status on large projects

What Makes You a Qualified Candidate

Bachelor's Degree in Computer Science (B.Tech) or a related discipline, with at least ten years of related research or industry experience; or Master's Degree in Computer Science (M.Tech/MCA) or a related discipline, with at least eight years of related research or industry experience; or a Ph.D.
in Computer Science or a related discipline, with at least five years of related research or industry experience
Technical leadership in composing very complex and visionary ideas in cloud-based data management, specifically query processing and optimization

What You'll Bring
Familiarity with various database technologies
Deep understanding of Amazon Web Services (AWS) / public cloud technologies and operations
Demonstrated design skills for large-scale, elastic and highly available cloud database services or distributed systems
Top-notch programming skills in C++, Java, Python, R, SQL
Computer Science fundamentals in object-oriented design, design patterns, and test-driven development
System development experience
Experience debugging complex software in a parallel processing environment
Passionate, self-motivated, pro-active, a risk and initiative taker, a good communicator (written and verbal), creative, and team-oriented
Experience using Agile software development methods and tools

Why We Think You'll Love Teradata

We prioritize a people-first culture because we know our people are at the very heart of our success. We embrace a flexible work model because we trust our people to make decisions about how, when, and where they work. We focus on well-being because we care about our people and their ability to thrive both personally and professionally. We are an anti-racist company because our dedication to Diversity, Equity, and Inclusion is more than a statement. It is a deep commitment to doing the work to foster an equitable environment that celebrates people for all of who they are.

Teradata invites all identities and backgrounds in the workplace. We work with deliberation and intent to ensure we are cultivating collaboration and inclusivity across our global organization. We are proud to be an equal opportunity and affirmative action employer. We do not discriminate based upon race, color, ancestry, religion, creed, sex (including pregnancy, childbirth, breastfeeding, or related conditions), national origin, sexual orientation, age, citizenship, marital status, disability, medical condition, genetic information, gender identity or expression, military and veteran status, or any other legally protected status.
Posted 1 month ago
4.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Job Description

An ETL Tester (4+ years must) is responsible for testing and validating the accuracy and completeness of data being extracted, transformed, and loaded (ETL) from various sources into target systems. These target systems can be on cloud or on premise. ETL Testers work closely with ETL developers, data engineers, data analysts, and other stakeholders to ensure the quality of data and the reliability of ETL processes, and must understand cloud architecture and design test strategies for data moving in and out of cloud systems.

Roles and Responsibilities
Strong in data warehouse testing: ETL and BI
Strong database knowledge: Oracle / SQL Server / Teradata / Snowflake
Strong SQL skills, with experience writing complex data validation SQL queries
Experience working in an Agile environment
Experience creating test strategies, release-level test plans and test cases
Develop and maintain test data for ETL testing
Design and execute test cases for ETL processes and data integration
Good knowledge of Rally, Jira and HP ALM
Experience in automation testing and data validation using Python
Document test results and communicate with stakeholders on the status of ETL testing

Skills: Rally, Agile environment, automation testing, data validation, Jira, ETL and BI, HP ALM, ETL testing, test strategy, data warehouse, data integration testing, test case creation, Python, Oracle / SQL Server / Teradata / Snowflake, SQL, data warehouse testing, database knowledge, test data maintenance
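For illustration, here is a hedged sketch of the kind of automated source-to-target completeness check an ETL tester might write in Python. The DSNs, schemas, and table names are invented assumptions.

```python
# Hedged sketch: source-to-target completeness check for an ETL load.
# Connection DSNs and table names are illustrative assumptions.
import pyodbc

SRC_SQL = "SELECT COUNT(*), COALESCE(SUM(amount), 0) FROM stage.orders"
TGT_SQL = "SELECT COUNT(*), COALESCE(SUM(amount), 0) FROM dw.fact_orders"

src = pyodbc.connect("DSN=staging_db").cursor().execute(SRC_SQL).fetchone()
tgt = pyodbc.connect("DSN=warehouse_db").cursor().execute(TGT_SQL).fetchone()

# Row counts and control totals must match exactly after the load completes.
assert src[0] == tgt[0], f"row count mismatch: {src[0]} vs {tgt[0]}"
assert src[1] == tgt[1], f"amount total mismatch: {src[1]} vs {tgt[1]}"
print("ETL completeness checks passed")
```

Checks like this typically run after every load cycle and report results into the test-management tool (Rally, Jira, or HP ALM).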
Posted 1 month ago
4.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description

An ETL Tester (4+ years must) is responsible for testing and validating the accuracy and completeness of data being extracted, transformed, and loaded (ETL) from various sources into target systems. These target systems can be on cloud or on premise. ETL Testers work closely with ETL developers, data engineers, data analysts, and other stakeholders to ensure the quality of data and the reliability of ETL processes, and must understand cloud architecture and design test strategies for data moving in and out of cloud systems.

Roles and Responsibilities
Strong in data warehouse testing: ETL and BI
Strong database knowledge: Oracle / SQL Server / Teradata / Snowflake
Strong SQL skills, with experience writing complex data validation SQL queries
Experience working in an Agile environment
Experience creating test strategies, release-level test plans and test cases
Develop and maintain test data for ETL testing
Design and execute test cases for ETL processes and data integration
Good knowledge of Rally, Jira and HP ALM
Experience in automation testing and data validation using Python
Document test results and communicate with stakeholders on the status of ETL testing

Skills: Rally, Agile environment, automation testing, data validation, Jira, ETL and BI, HP ALM, ETL testing, test strategy, data warehouse, data integration testing, test case creation, Python, Oracle / SQL Server / Teradata / Snowflake, SQL, data warehouse testing, database knowledge, test data maintenance
Posted 1 month ago
4.0 years
0 Lacs
Greater Kolkata Area
On-site
Job Description

An ETL Tester (4+ years must) is responsible for testing and validating the accuracy and completeness of data being extracted, transformed, and loaded (ETL) from various sources into target systems. These target systems can be on cloud or on premise. ETL Testers work closely with ETL developers, data engineers, data analysts, and other stakeholders to ensure the quality of data and the reliability of ETL processes, and must understand cloud architecture and design test strategies for data moving in and out of cloud systems.

Roles and Responsibilities
Strong in data warehouse testing: ETL and BI
Strong database knowledge: Oracle / SQL Server / Teradata / Snowflake
Strong SQL skills, with experience writing complex data validation SQL queries
Experience working in an Agile environment
Experience creating test strategies, release-level test plans and test cases
Develop and maintain test data for ETL testing
Design and execute test cases for ETL processes and data integration
Good knowledge of Rally, Jira and HP ALM
Experience in automation testing and data validation using Python
Document test results and communicate with stakeholders on the status of ETL testing

Skills: Rally, Agile environment, automation testing, data validation, Jira, ETL and BI, HP ALM, ETL testing, test strategy, data warehouse, data integration testing, test case creation, Python, Oracle / SQL Server / Teradata / Snowflake, SQL, data warehouse testing, database knowledge, test data maintenance
Posted 1 month ago
4.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Description

An ETL Tester (4+ years must) is responsible for testing and validating the accuracy and completeness of data being extracted, transformed, and loaded (ETL) from various sources into target systems. These target systems can be on cloud or on premise. ETL Testers work closely with ETL developers, data engineers, data analysts, and other stakeholders to ensure the quality of data and the reliability of ETL processes, and must understand cloud architecture and design test strategies for data moving in and out of cloud systems.

Roles and Responsibilities
Strong in data warehouse testing: ETL and BI
Strong database knowledge: Oracle / SQL Server / Teradata / Snowflake
Strong SQL skills, with experience writing complex data validation SQL queries
Experience working in an Agile environment
Experience creating test strategies, release-level test plans and test cases
Develop and maintain test data for ETL testing
Design and execute test cases for ETL processes and data integration
Good knowledge of Rally, Jira and HP ALM
Experience in automation testing and data validation using Python
Document test results and communicate with stakeholders on the status of ETL testing

Skills: Rally, Agile environment, automation testing, data validation, Jira, ETL and BI, HP ALM, ETL testing, test strategy, data warehouse, data integration testing, test case creation, Python, Oracle / SQL Server / Teradata / Snowflake, SQL, data warehouse testing, database knowledge, test data maintenance
Posted 1 month ago
4.0 - 8.0 years
5 - 17 Lacs
Noida, Uttar Pradesh, India
On-site
Position Title: Data Architect / Solution Architect
Location: Pan India

This position description should represent your role and responsibilities at the time of appointment; however, due to the dynamic nature of our business, your job title, key tasks and responsibilities are likely to evolve over time. The flexibility to adapt to any changes should be considered a key requirement of working at TPG Telecom.

Role Purpose & Environment

In this role you will work hand-in-hand with various technology and business stakeholders to design and build TPG's modern data platform in the cloud and manage the legacy applications. You will provide strategic direction and leadership guidance, driving architecture and implementation initiatives by leveraging your knowledge and experience in the area. The role also extends into the consumption side of data and will allow you to deliver business intelligence capabilities (including Advanced Analytics) and strategies for information delivery and data exploration to support business objectives and requirements. We are seeking someone with a passion for understanding and leveraging data, with the attitude and behaviour to deliver on commitments and take ownership of data products when required.

Key Responsibilities
Define and design the overall data architecture, strategy, and data capabilities roadmap consistent with our technology direction
Define and design the data platforms, tools and governing processes
Create, maintain and communicate go-forward strategies for business intelligence capabilities and tools
Be responsible and accountable for producing the data solution and data product architecture designs, ensuring they are submitted and progress via the prescribed governance process through to approval (ARB) in a timely manner aligned with prescribed project timelines
Define and review data solutions for re-usability, scalability, synergy opportunities and alignment to defined best practices and guidelines
Create and evolve the data technology roadmap to align with continuously evolving business needs
Help define and improve best practices, guidelines, and integration with other enterprise solutions
Participate in planning, dependency identification, and management, as well as estimation with Project Managers
Lead Work Breakdown identification and workshops utilising architecture designs as input
Demonstrate a grasp of architecture techniques and the ability to work effectively with senior business stakeholders and initiative owners
Act as a technology advisor on data to business leaders and strategic leaders on technology direction

Key Experience, Skills, and Qualifications

Domain Expertise
7+ years of professional experience in a data architecture or data engineering role, demonstrating a high degree of proficiency in designing and developing complex, high-quality data solutions according to our architecture governance policies and guidelines
Strong experience in developing and maintaining data warehouses (e.g., Redshift, Teradata)
Able to work independently and develop the solution architecture according to the business requirements and compliance requirements
Strong data warehouse development experience using different ETL tools (e.g., SAS DI, Glue, DBT)
Experience with data streaming platforms (e.g., Kafka/Kinesis)
Familiarity with different operational orchestration platforms (e.g., Airflow, LSF scheduler, etc.)
Experience with data catalogue and data governance tools
Understanding of CLDM, Star Schema, Data Mesh and Data Product concepts.
Exposure to machine learning, reporting, data sharing and data-intensive application use cases.
Extensive experience in consulting with business stakeholders and other user groups to deliver both strategic and tactical information management solutions.
Experience working within matrix structures, with a demonstrated ability to broker outcomes effectively and collaboratively with colleagues and peers.
Experience with different delivery methodologies (e.g., Waterfall, Agile).
Telecommunications industry experience.
Bachelor's degree in computer science, computer programming or a related field preferred.

Individual Skills, Mindset & Behaviours
Strong communication skills, with the ability to communicate complex technical concepts in a digestible way.
Ability to effortlessly switch gears from a summary view for leadership to a hands-on discussion with practitioners.
Assertive, with the confidence to be the voice of authority on what is best for the team.
A high-energy, passionate outlook on the role, with the ability to influence those around them.
Ability to build a sense of trust and rapport that creates a comfortable, respectful and effective workplace.
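For illustration only, a minimal orchestration sketch matching the Airflow requirement above (the DAG id, task names and schedule are hypothetical assumptions, not an actual TPG pipeline):

```python
# Minimal Airflow DAG sketch: a daily extract -> transform -> load chain.
# All names (dag_id, task ids) are hypothetical; Airflow is the
# orchestrator named in the posting.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Placeholder: pull the day's records from a source system.
    print("extracting for", context["ds"])


def transform(**context):
    # Placeholder: apply cleansing / conforming rules.
    print("transforming for", context["ds"])


def load(**context):
    # Placeholder: write conformed rows to the warehouse (e.g., Redshift).
    print("loading for", context["ds"])


with DAG(
    dag_id="daily_customer_pipeline",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t3 = PythonOperator(task_id="load", python_callable=load)
    t1 >> t2 >> t3
```

The linear extract >> transform >> load dependency mirrors the warehouse-loading pattern the posting describes.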
Posted 1 month ago
4.0 - 8.0 years
5 - 17 Lacs
Thane, Maharashtra, India
On-site
Position Title: Data Architect / Solution Architect
Location: Pan India

This position description represents your role and responsibilities at the time of appointment; however, due to the dynamic nature of our business, your job title, key tasks and responsibilities are likely to evolve over time. The flexibility to adapt to such changes should be considered a key requirement of working at TPG Telecom.

Role Purpose & Environment
In this role you will work hand-in-hand with various technology and business stakeholders to design and build TPG's modern data platform in the cloud and manage the legacy applications. You will provide strategic direction and leadership, driving architecture and implementation initiatives by leveraging your knowledge and experience in the area. The role also extends into the consumption side of data and will allow you to deliver business intelligence capabilities (including Advanced Analytics) and strategies for information delivery and data exploration that support business objectives and requirements. We are seeking someone with a passion for understanding and leveraging data, and the attitude and behaviour to deliver on commitments and take ownership of data products when required.

Key Responsibilities
Define and design the overall data architecture, strategy, and data capabilities roadmap, consistent with our technology direction.
Define and design the data platforms, tools and governing processes.
Create, maintain and communicate go-forward strategies for business intelligence capabilities and tools.
Be responsible and accountable for producing the data solution and data product architecture designs, ensuring they are submitted and progress through the prescribed governance process to approval (ARB) in a timely manner, aligned with project timelines.
Define and review data solutions for re-usability, scalability, synergy opportunities and alignment with defined best practices and guidelines.
Create and evolve the data technology roadmap to align with continuously evolving business needs.
Help define and improve best practices, guidelines, and integration with other enterprise solutions.
Participate in planning, dependency identification and management, and estimation with Project Managers.
Lead work-breakdown identification and workshops, using architecture designs as input.
Demonstrate a strong grasp of architecture techniques and the ability to work effectively with senior business stakeholders and initiative owners.
Act as a technology advisor on data to business leaders, and advise strategic leaders on technology direction.

Key Experience, Skills, and Qualifications
Domain Expertise
7+ years of professional experience in a data architecture or data engineering role, demonstrating a high degree of proficiency in designing and developing complex, high-quality data solutions in line with architecture governance policies and guidelines.
Strong experience in developing and maintaining data warehouses (e.g., Redshift, Teradata).
Able to work independently and develop the solution architecture according to business and compliance requirements.
Strong data warehouse development experience using different ETL tools (e.g., SAS DI, Glue, dbt).
Experience with data streaming platforms (e.g., Kafka, Kinesis).
Familiarity with different operational orchestration platforms (e.g., Airflow, LSF scheduler).
Experience with data catalogue and data governance tools.
Understanding of CLDM, Star Schema, Data Mesh and Data Product concepts.
Exposure to machine learning, reporting, data sharing and data-intensive application use cases.
Extensive experience in consulting with business stakeholders and other user groups to deliver both strategic and tactical information management solutions.
Experience working within matrix structures, with a demonstrated ability to broker outcomes effectively and collaboratively with colleagues and peers.
Experience with different delivery methodologies (e.g., Waterfall, Agile).
Telecommunications industry experience.
Bachelor's degree in computer science, computer programming or a related field preferred.

Individual Skills, Mindset & Behaviours
Strong communication skills, with the ability to communicate complex technical concepts in a digestible way.
Ability to effortlessly switch gears from a summary view for leadership to a hands-on discussion with practitioners.
Assertive, with the confidence to be the voice of authority on what is best for the team.
A high-energy, passionate outlook on the role, with the ability to influence those around them.
Ability to build a sense of trust and rapport that creates a comfortable, respectful and effective workplace.
Posted 1 month ago
0.0 - 5.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
Comcast brings together the best in media and technology. We drive innovation to create the world's best entertainment and online experiences. As a Fortune 50 leader, we set the pace in a variety of innovative and fascinating businesses and create career opportunities across a wide range of locations and disciplines. We are at the forefront of change and move at an amazing pace, thanks to our remarkable people, who bring cutting-edge products and services to life for millions of customers every day. If you share in our passion for teamwork, our vision to revolutionize industries and our goal to lead the future in media and technology, we want you to fast-forward your career at Comcast.

Job Summary
Responsible for designing, building and overseeing the deployment and operation of technology architecture, solutions and software to capture, manage, store and utilize structured and unstructured data from internal and external sources. Establishes and builds processes and structures based on business and technical requirements to channel data from multiple inputs, route it appropriately and store it using any combination of distributed (cloud) structures, local databases and other applicable storage forms as required. Develops technical tools and programming that leverage artificial intelligence, machine learning and big-data techniques to cleanse, organize and transform data, and to maintain, defend and update data structures and integrity on an automated basis. Creates and establishes design standards and assurance processes for software, systems and applications development to ensure compatibility and operability of data connections, flows and storage requirements. Reviews internal and external business and product requirements for data operations and activity and suggests changes and upgrades to systems and storage to accommodate ongoing needs. Works with data modelers/analysts to understand the business problems they are trying to solve, then creates or augments data assets to feed their analysis. Integrates knowledge of business and functional priorities. Acts as a key contributor in a complex and crucial environment. May lead teams or projects and share expertise.

Job Description
Position: Data Engineer 4
Experience: 8 to 11.5 years
Job Location: Chennai, Tamil Nadu

Requirements
Databases: Deep knowledge of relational databases (e.g., PostgreSQL, MySQL) and NoSQL databases (e.g., MongoDB, Cassandra, Couchbase). (MUST)
Big Data Technologies: Experience with Spark, Kafka and other big-data ecosystem tools. (NICE TO HAVE)
Cloud Platforms: Experience with cloud services such as AWS, Azure or Google Cloud Platform, with a particular focus on data engineering services. (NICE TO HAVE)
Version Control: Experience with version control systems like Git. (MUST)
CI/CD: Knowledge of CI/CD pipelines for automating development and deployment processes. (MUST)
Proficiency in Elasticsearch and experience managing large-scale clusters. (MUST)
Hands-on experience with containerization technologies like Docker and Kubernetes. (MUST: Docker)
Strong programming skills in scripting languages such as Python, Bash or similar. (NICE TO HAVE)

Key Responsibilities
Design, develop and maintain scalable data pipelines and infrastructure.
Ensure compliance with security regulations and implement advanced security measures to protect company data.
Implement and manage CI/CD pipelines for data applications.
Work with containerization technologies (Docker, Kubernetes) to deploy and manage data services.
Optimize and manage Elasticsearch clusters for log ingestion, along with tools such as Logstash, Fluentd and Promtail that forward logs to an Elasticsearch instance or another log-ingestion stack (e.g., Loki + Grafana); a minimal ingestion sketch follows this posting.
Collaborate with other departments (e.g., Data Science, IT, DevOps) to integrate data solutions with existing business systems.
Optimize the performance of data pipelines and resolve data integrity and quality issues.
Document data processes and architectures to ensure transparency and facilitate maintenance.
Monitor industry trends and adopt best practices to continuously improve our data engineering solutions.

Core Responsibilities
Develops data structures and pipelines aligned to established standards and guidelines to organize, collect, standardize and transform data that helps generate insights and address reporting needs.
Focuses on ensuring data quality during ingest and processing as well as the final load to the target tables.
Creates standard ingestion frameworks for structured and unstructured data, as well as checking and reporting on the quality of the data being processed.
Creates standard methods for end users / downstream applications to consume data, including but not limited to database views, extracts and Application Programming Interfaces.
Develops and maintains information systems (e.g., data warehouses, data lakes), including data access Application Programming Interfaces.
Participates in the implementation of solutions via data architecture, data engineering or data manipulation on both on-prem platforms (Kubernetes, Teradata) and cloud platforms (Databricks).
Determines the appropriate storage platform across different on-prem (MinIO and Teradata) and cloud (AWS S3, Redshift) options depending on the privacy, access and sensitivity requirements.
Understands the data lineage from source to the final semantic layer, along with the transformation rules applied, to enable faster troubleshooting and impact analysis during changes.
Collaborates with technology and platform management partners to optimize data sourcing and processing rules to ensure appropriate data quality and process optimization.
Creates and establishes design standards and assurance processes for software, systems and applications development to ensure compatibility and operability of data connections, flows and storage requirements.
Reviews internal and external business and product requirements for data operations and activity and suggests changes and upgrades to systems and storage to accommodate ongoing needs.
Develops strategies for data acquisition, archive recovery and database implementation.
Manages data migrations/conversions and troubleshoots data processing issues.
Understands data sensitivity and customer data privacy rules and regulations, and applies them consistently in all Information Lifecycle Management activities.
Identifies and reacts to system notifications and logs to ensure quality standards for databases and applications.
Solves abstract problems beyond a single development language or situation by reusing data files and flags already set.
Solves critical issues and shares knowledge such as trends, aggregates and data volumes regarding specific data sources.
Consistent exercise of independent judgment and discretion in matters of significance.
Regular, consistent and punctual attendance. Must be able to work nights and weekends, variable schedule(s) as necessary.
Other duties and responsibilities as assigned.

Employees at all levels are expected to:
Understand our Operating Principles; make them the guidelines for how you do your job.
Own the customer experience - think and act in ways that put our customers first, give them seamless digital options at every touchpoint, and make them promoters of our products and services.
Know your stuff - be enthusiastic learners, users and advocates of our game-changing technology, products and services, especially our digital tools and experiences.
Win as a team - make big things happen by working together and being open to new ideas.
Be an active part of the Net Promoter System - a way of working that brings more employee and customer feedback into the company - by joining huddles, making call-backs and helping us elevate opportunities to do better for our customers.
Drive results and growth.
Respect and promote inclusion & diversity.
Do what's right for each other, our customers, investors and our communities.

Disclaimer: This information has been designed to indicate the general nature and level of work performed by employees in this role. It is not designed to contain or be interpreted as a comprehensive inventory of all duties, responsibilities and qualifications.

Comcast is proud to be an equal opportunity workplace. We will consider all qualified applicants for employment without regard to race, color, religion, age, sex, sexual orientation, gender identity, national origin, disability, veteran status, genetic information, or any other basis protected by applicable law.

Base pay is one part of the Total Rewards that Comcast provides to compensate and recognize employees for their work. Most sales positions are eligible for a Commission under the terms of an applicable plan, while most non-sales positions are eligible for a Bonus. Additionally, Comcast provides best-in-class Benefits to eligible employees. We believe that benefits should connect you to the support you need when it matters most, and should help you care for those who matter most. That's why we provide an array of options, expert guidance and always-on tools, personalized to meet the needs of your reality - to help support you physically, financially and emotionally through the big milestones and in your everyday life. Please visit the compensation and benefits summary on our careers site for more details.

Education
Bachelor's Degree
While possessing the stated degree is preferred, Comcast also may consider applicants who hold some combination of coursework and experience, or who have extensive related professional experience.

Relevant Work Experience
7-10 Years
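For illustration only, a minimal log-ingestion sketch matching the Elasticsearch requirement above (the host, index naming and sample records are assumptions; the official elasticsearch Python client's bulk helper is a standard way to index a batch):

```python
# Minimal log-ingestion sketch using the official Elasticsearch Python client.
# Host, index name and the sample records are hypothetical.
from datetime import datetime, timezone

from elasticsearch import Elasticsearch, helpers

es = Elasticsearch("http://localhost:9200")  # assumed local cluster


def log_actions(records):
    """Wrap raw log dicts as bulk-API actions for a daily index."""
    index = f"app-logs-{datetime.now(timezone.utc):%Y.%m.%d}"
    for rec in records:
        yield {"_index": index, "_source": rec}


sample = [
    {"level": "INFO", "msg": "service started"},
    {"level": "ERROR", "msg": "upstream timeout"},
]

# Bulk-index the batch; helpers.bulk returns (success_count, errors).
ok, errors = helpers.bulk(es, log_actions(sample))
print(f"indexed {ok} docs, {len(errors)} errors")
```

In practice Logstash, Fluentd or Promtail would forward logs continuously; the bulk helper above simply shows the shape of what lands in the cluster.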
Posted 1 month ago
0.0 - 5.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
Overview
Our analysts transform data into meaningful insights that drive strategic decision-making. They analyze trends, interpret data, and discover opportunities. Working cross-functionally, they craft narratives from the numbers, directly contributing to our success. Their work influences key business decisions and shapes the direction of Comcast.

Success Profile
What makes a successful Data Engineer 4 at Comcast? Check out these top traits and explore role-specific skills in the job description below: Good Listener, Problem Solver, Organized, Collaborative, Perceptive, Analytical.

Benefits
We're proud to offer comprehensive benefits to help support you physically, financially and emotionally through the big milestones and in your everyday life.
Paid Time Off: We know how important it can be to spend time away from work to relax, recover from illness, or take time to care for others' needs.
Physical Wellbeing: We offer a range of benefits and support programs to ensure that you and your loved ones get the care you need.
Financial Wellbeing: These benefits give you personalized support designed entirely around your unique needs today and for the future.
Emotional Wellbeing: No matter how you're feeling or what you're dealing with, there are benefits to help when you need it, in the way that works for you.
Life Events + Family Support: Benefits that support you no matter where you are in life's journey.

Data Engineer 4
Location: Chennai, India
Req ID: R412866
Job Type: Full Time
Category: Analytics
Date posted: 06/10/2025

Comcast brings together the best in media and technology. We drive innovation to create the world's best entertainment and online experiences. As a Fortune 50 leader, we set the pace in a variety of innovative and fascinating businesses and create career opportunities across a wide range of locations and disciplines. We are at the forefront of change and move at an amazing pace, thanks to our remarkable people, who bring cutting-edge products and services to life for millions of customers every day. If you share in our passion for teamwork, our vision to revolutionize industries and our goal to lead the future in media and technology, we want you to fast-forward your career at Comcast.

Job Summary
Responsible for designing, building and overseeing the deployment and operation of technology architecture, solutions and software to capture, manage, store and utilize structured and unstructured data from internal and external sources. Establishes and builds processes and structures based on business and technical requirements to channel data from multiple inputs, route it appropriately and store it using any combination of distributed (cloud) structures, local databases and other applicable storage forms as required. Develops technical tools and programming that leverage artificial intelligence, machine learning and big-data techniques to cleanse, organize and transform data, and to maintain, defend and update data structures and integrity on an automated basis. Creates and establishes design standards and assurance processes for software, systems and applications development to ensure compatibility and operability of data connections, flows and storage requirements. Reviews internal and external business and product requirements for data operations and activity and suggests changes and upgrades to systems and storage to accommodate ongoing needs. Works with data modelers/analysts to understand the business problems they are trying to solve, then creates or augments data assets to feed their analysis.
Integrates knowledge of business and functional priorities. Acts as a key contributor in a complex and crucial environment. May lead teams or projects and share expertise.

Job Description
Position: Data Engineer 4
Experience: 8 to 11.5 years
Job Location: Chennai, Tamil Nadu

Requirements
Databases: Deep knowledge of relational databases (e.g., PostgreSQL, MySQL) and NoSQL databases (e.g., MongoDB, Cassandra, Couchbase). (MUST)
Big Data Technologies: Experience with Spark, Kafka and other big-data ecosystem tools. (NICE TO HAVE)
Cloud Platforms: Experience with cloud services such as AWS, Azure or Google Cloud Platform, with a particular focus on data engineering services. (NICE TO HAVE)
Version Control: Experience with version control systems like Git. (MUST)
CI/CD: Knowledge of CI/CD pipelines for automating development and deployment processes. (MUST)
Proficiency in Elasticsearch and experience managing large-scale clusters. (MUST)
Hands-on experience with containerization technologies like Docker and Kubernetes. (MUST: Docker)
Strong programming skills in scripting languages such as Python, Bash or similar. (NICE TO HAVE)

Key Responsibilities
Design, develop, and maintain scalable data pipelines and infrastructure.
Ensure compliance with security regulations and implement advanced security measures to protect company data.
Implement and manage CI/CD pipelines for data applications.
Work with containerization technologies (Docker, Kubernetes) to deploy and manage data services.
Optimize and manage Elasticsearch clusters for log ingestion, along with tools such as Logstash, Fluentd and Promtail that forward logs to an Elasticsearch instance or another log-ingestion stack (e.g., Loki + Grafana).
Collaborate with other departments (e.g., Data Science, IT, DevOps) to integrate data solutions with existing business systems.
Optimize the performance of data pipelines and resolve data integrity and quality issues.
Document data processes and architectures to ensure transparency and facilitate maintenance.
Monitor industry trends and adopt best practices to continuously improve our data engineering solutions.

Core Responsibilities
Develops data structures and pipelines aligned to established standards and guidelines to organize, collect, standardize and transform data that helps generate insights and address reporting needs.
Focuses on ensuring data quality during ingest and processing as well as the final load to the target tables.
Creates standard ingestion frameworks for structured and unstructured data, as well as checking and reporting on the quality of the data being processed.
Creates standard methods for end users / downstream applications to consume data, including but not limited to database views, extracts and Application Programming Interfaces.
Develops and maintains information systems (e.g., data warehouses, data lakes), including data access Application Programming Interfaces.
Participates in the implementation of solutions via data architecture, data engineering or data manipulation on both on-prem platforms (Kubernetes, Teradata) and cloud platforms (Databricks).
Determines the appropriate storage platform across different on-prem (MinIO and Teradata) and cloud (AWS S3, Redshift) options depending on the privacy, access and sensitivity requirements.
Understands the data lineage from source to the final semantic layer, along with the transformation rules applied, to enable faster troubleshooting and impact analysis during changes.
Collaborates with technology and platform management partners to optimize data sourcing and processing rules to ensure appropriate data quality and process optimization.
Creates and establishes design standards and assurance processes for software, systems and applications development to ensure compatibility and operability of data connections, flows and storage requirements.
Reviews internal and external business and product requirements for data operations and activity and suggests changes and upgrades to systems and storage to accommodate ongoing needs.
Develops strategies for data acquisition, archive recovery and database implementation.
Manages data migrations/conversions and troubleshoots data processing issues.
Understands data sensitivity and customer data privacy rules and regulations, and applies them consistently in all Information Lifecycle Management activities.
Identifies and reacts to system notifications and logs to ensure quality standards for databases and applications.
Solves abstract problems beyond a single development language or situation by reusing data files and flags already set.
Solves critical issues and shares knowledge such as trends, aggregates and data volumes regarding specific data sources.
Consistent exercise of independent judgment and discretion in matters of significance.
Regular, consistent and punctual attendance. Must be able to work nights and weekends, variable schedule(s) as necessary.
Other duties and responsibilities as assigned.

Employees at all levels are expected to:
Understand our Operating Principles; make them the guidelines for how you do your job.
Own the customer experience - think and act in ways that put our customers first, give them seamless digital options at every touchpoint, and make them promoters of our products and services.
Know your stuff - be enthusiastic learners, users and advocates of our game-changing technology, products and services, especially our digital tools and experiences.
Win as a team - make big things happen by working together and being open to new ideas.
Be an active part of the Net Promoter System - a way of working that brings more employee and customer feedback into the company - by joining huddles, making call-backs and helping us elevate opportunities to do better for our customers.
Drive results and growth.
Respect and promote inclusion & diversity.
Do what's right for each other, our customers, investors and our communities.

Disclaimer: This information has been designed to indicate the general nature and level of work performed by employees in this role. It is not designed to contain or be interpreted as a comprehensive inventory of all duties, responsibilities and qualifications.

Comcast is proud to be an equal opportunity workplace. We will consider all qualified applicants for employment without regard to race, color, religion, age, sex, sexual orientation, gender identity, national origin, disability, veteran status, genetic information, or any other basis protected by applicable law.

Base pay is one part of the Total Rewards that Comcast provides to compensate and recognize employees for their work. Most sales positions are eligible for a Commission under the terms of an applicable plan, while most non-sales positions are eligible for a Bonus. Additionally, Comcast provides best-in-class Benefits to eligible employees. We believe that benefits should connect you to the support you need when it matters most, and should help you care for those who matter most.
That's why we provide an array of options, expert guidance and always-on tools, personalized to meet the needs of your reality - to help support you physically, financially and emotionally through the big milestones and in your everyday life. Please visit the compensation and benefits summary on our careers site for more details.

Education
Bachelor's Degree
While possessing the stated degree is preferred, Comcast also may consider applicants who hold some combination of coursework and experience, or who have extensive related professional experience.

Relevant Work Experience
7-10 Years
Posted 1 month ago
15.0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Job Description:

About us*
At Bank of America, we are guided by a common purpose to help make financial lives better through the power of every connection. Responsible Growth is how we run our company and how we deliver for our clients, teammates, communities and shareholders every day. One of the keys to driving Responsible Growth is being a great place to work for our teammates around the world. We're devoted to being a diverse and inclusive workplace for everyone. We hire individuals with a broad range of backgrounds and experiences and invest heavily in our teammates and their families by offering competitive benefits to support their physical, emotional, and financial well-being. Bank of America believes both in the importance of working together and offering flexibility to our employees. We use a multi-faceted approach for flexibility, depending on the various roles in our organization. Working at Bank of America will give you a great career with opportunities to learn, grow and make an impact, along with the power to make a difference. Join us!

Global Business Services
Global Business Services delivers Technology and Operations capabilities to Lines of Business and Staff Support Functions of Bank of America through a centrally managed, globally integrated delivery model and globally resilient operations. Global Business Services is recognized for flawless execution, sound risk management, operational resiliency, operational excellence and innovation. In India, we are present in five locations and operate as BA Continuum India Private Limited (BACI), a non-banking subsidiary of Bank of America Corporation and the operating company for India operations of Global Business Services.

Process Overview*
GF (Global Finance) Global Financial Control India (GFCI) is part of the CFO Global Delivery strategy to provide offshore delivery to Lines of Business (LOBs) and Enterprise Finance functions. The capabilities hosted include General Accounting & Reconciliations, Legal Entity Controllership, Corporate Sustainability Controllership, Corporate Controllership, Management Reporting & Analysis, Finance Systems Support, Operational Risk and Controls, Regulatory Reporting and Strategic initiatives. The Financed Emissions Accounting & Reporting team, part of the Global Financial Control - Corporate Sustainability Controller organization within the CFO Group, plays a critical role in supporting the calculation of asset-level balance sheet Financed Emissions, which are integral to the Bank's goal of achieving net-zero greenhouse gas emissions by 2050.

Job Description*
The role is responsible for building data sourcing processes, conducting data research and analytics using available tools, supporting model input data monitoring, and developing the data and reporting frameworks needed to support our approaches to net-zero progress alignment, target setting, client engagement and reputational risk review, empowering banking teams to assist clients on net-zero financing strategies and specific commercial opportunities. The role will support and partner with business stakeholders in the Enterprise Climate Program Office, Technology, Climate and Credit Risk, the Global Environment Group, Lines of Business, Legal Entity Controllers and Model Risk Management. Additionally, the role will support data governance, lineage and controls by building, improving and executing data processes.
The candidate must be able to communicate across technology partners, the climate office and the business lines to execute on viable analytical solutions, with a focus on end-user experience and usability. The candidate must be strong in identifying and explaining data quality issues to help achieve successful and validated data for model execution. This individual should feel at ease creating complex SQL queries, extracting large raw datasets from various sources, merging and transforming raw data into usable data and analytic structures, and benchmarking results against known values. They must feel comfortable automating repeatable processes, generating data insights that are easy for end users to interpret, conducting quantitative analysis, and effectively communicating and disseminating findings and data points to stakeholders. They should also understand greenhouse gas accounting frameworks and financed emissions calculations as applied to different sectors and asset classes. The candidate will have experience representing ERA with critical Climate stakeholders across the firm, and should demonstrate a capacity for strategic leadership, exercising significant independent judgment and discretion and working towards strategic goals with limited oversight.

Responsibilities*
Net zero transition planning and execution: Partners with GEG, the Program Office and Lines of Business in developing and executing the enterprise-wide net zero transition plan and operational roadmap, with a focus on analysis and reporting capabilities and data procurement, liaising with consultants, external data providers, and the Climate Risk and Technology functions.
Data development & operations: Researches data requirements; produces executive-level and detailed data summaries; validates the accuracy, completeness, reasonableness and timeliness of datasets; and develops desktop procedures for BAU operations. Performs data reviews and tests technology implementations for financed emissions deliverables. Executes BAU processes such as new data cycle creation, data controls and data quality processes. Produces data summary materials and walks through them with the leadership team.
Data analytics & strategy: Analyzes the data and explains how granular data movements across history affect new results. Identifies trends of data improvement and areas needing improvement. Develops automated data analysis and answers common questions to justify changes in the data. Supports ad hoc analytics of bank-wide and client net zero commitment implementation, with an initial focus on automation of financed emissions analysis, reporting against PCAF standards, and net zero transition preparedness analytics and engagement to enhance the strategy for meeting emissions goals for target sectors.
Requirements*

Education*
Bachelor's degree in data management or analytics, engineering, sustainability, finance or another related field, OR a master's degree in data science, earth/climate sciences, engineering, sustainability, natural resource management, environmental economics, finance or another related field.

Certifications If Any: NA

Experience Range*
Minimum 15+ years in Climate, Financed Emissions, finance or financial reporting.
Three (3) or more years of experience in statistical and/or data management, analytics and visualization (intersection with financial services strongly preferred).

Foundational skills*
Deep expertise in SQL, Excel, Python, automation & optimization, and project management.
Knowledge of data architecture concepts, data models and ETL processes.
Deep understanding of how data processes work and the ability to solve the dynamically evolving, complex data challenges that are part of day-to-day activities.
Experience in extracting and combining data from multiple sources, and aggregating data to support model development; a brief multi-source extraction sketch follows this posting.
Experience in multiple database environments such as Oracle, Hadoop and Teradata.
Strong technical and visualization skills, with the ability to understand business goals and needs, and a commitment to delivering recommendations that will guide strategic decisions.
Knowledge of Alteryx, Tableau and R (knowledge of NLP, data scraping and generative AI welcome).
Strong leadership skills and a proven ability to motivate employees and promote teamwork.
Excellent interpersonal, management and teamwork skills.
High level of independent decision-making ability.
Highly motivated self-starter with excellent time management skills and the ability to effectively manage multiple priorities and timelines.
Demonstrated ability to motivate others in a high-stress environment to achieve goals.
Ability to effectively communicate and resolve conflicts, both orally and in writing, with internal and external clients.
Ability to adapt to a dynamic and evolving work environment.
Well-developed analytical and problem-solving skills.
Experience with and knowledge of the principles and practices of management and employee development.
Ability to think critically and solve problems with rational solutions.
Ability to react and make decisions quickly under pressure with good judgment.
Strong documentation and presentation skills to explain data analysis visually and procedurally, tailored to the audience.
Ability to quickly identify risks and determine reasonable solutions.

Desired Skills
Advanced knowledge of Finance
Advanced knowledge of Climate Risk

Work Timings*
Window 12:30 PM to 9:30 PM (9-hour shift; may require stretch during peak periods)

Job Location*
Mumbai
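For illustration only, a brief multi-source extraction and aggregation sketch of the kind the foundational skills describe (SQLite stands in for Oracle/Hadoop/Teradata; the tables, columns and emission factors are hypothetical):

```python
# Sketch: pull exposures from one table, emission factors from another,
# merge, and aggregate into a model-input dataset. All names are hypothetical.
import sqlite3  # stand-in for an Oracle/Teradata connection

import pandas as pd

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE loans(loan_id INT, sector TEXT, outstanding REAL);
    INSERT INTO loans VALUES (1,'power',100.0),(2,'power',50.0),(3,'auto',80.0);
    CREATE TABLE factors(sector TEXT, tco2e_per_unit REAL);
    INSERT INTO factors VALUES ('power',0.9),('auto',0.4);
""")

# Extract from each "source", then combine.
loans = pd.read_sql("SELECT * FROM loans", con)
factors = pd.read_sql("SELECT * FROM factors", con)

# Merge and aggregate: attributed emissions per sector.
merged = loans.merge(factors, on="sector")
merged["financed_emissions"] = merged["outstanding"] * merged["tco2e_per_unit"]
summary = merged.groupby("sector", as_index=False)["financed_emissions"].sum()
print(summary)
```

A production version would replace the in-memory stand-in with governed source connections and benchmark the aggregates against known values, as the posting describes.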
Posted 1 month ago
2.0 - 5.0 years
2 - 7 Lacs
Bengaluru
Work from Office
What you will do:
Plan and release developments in different environments.
Carry out prototypes, unit tests and performance tests, and make sure they respect the technical specs.
Take charge of production monitoring and the related incidents.
Maintain scripts (Shell and Perl); a minimal monitoring-script sketch follows this posting.
Prepare technical documentation.
Automate and industrialize developments, the release of technical and functional components in different environments, and infrastructure monitoring.
Take charge of technical analysis and propose technical solutions.
Follow the quality standards imposed by the project.
Use standard or innovative functionalities of the product in collaboration with the business lines as well as with the IT teams.
Use MicroStrategy on Teradata to ensure optimum performance.
Use MicroStrategy with Big Data technologies.
Offer MicroStrategy expertise throughout developments.
Review code and ensure that MicroStrategy is used correctly.
Analyze and correct incidents in production.

Profile Required:
Minimum 2 years' experience.
Knowledge of a MicroStrategy BI environment (development).
Knowledge of the Linux working environment.
Knowledge of Teradata/PostgreSQL databases.
Experience working in an Agile environment.
Team spirit, integrity and autonomy.
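For illustration only, a minimal production-monitoring sketch of the kind this role maintains (the host, credentials and monitored table are hypothetical; teradatasql, Teradata's DB API 2.0 driver for Python, is assumed here in place of the legacy Shell/Perl logic):

```python
# Sketch: a small data-freshness check run by the scheduler.
# Host, credentials and the fact table are hypothetical.
import sys
from datetime import date

import teradatasql

con = teradatasql.connect(host="tdprod.example.com",  # assumed host
                          user="monitor", password="***")
cur = con.cursor()
# Has today's batch landed in the (hypothetical) fact table?
cur.execute("SELECT CAST(MAX(load_ts) AS DATE) FROM edw.daily_sales_fact")
(last_load,) = cur.fetchone()
con.close()

print(f"last load date: {last_load}")
if last_load is None or last_load < date.today():
    # A non-zero exit lets the scheduler raise an incident.
    sys.exit("daily load is late")
```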
Posted 1 month ago
7.0 - 12.0 years
9 - 14 Lacs
Mumbai
Work from Office
Job Description:
Business Title: Lead Technical Architect
Years of Experience: > 7 years

Must-have skills:
1. Database (SQL Server / Snowflake / Teradata / Redshift / Vertica / Oracle / BigQuery / Azure DW, etc.)
2. ETL tool (Talend, Informatica, IICS (Informatica Cloud))
3. Experience in cloud computing (one or more of AWS, Azure, GCP)
4. Python, UNIX shell scripting, project & resource management
5. SVN, JIRA, automation workflow (Apache Airflow, Tidal, Tivoli or similar)

Good-to-have skills:
1. PySpark, BigQuery, familiarity with NoSQL such as MongoDB
2. Client-facing skills

Job Description:
The Technical Lead / Technical Consultant is a core role and the focal point of the project team, responsible for the whole technical solution and managing the day-to-day delivery. The role focuses on the technical solution architecture, detailed technical design, coaching of the development/implementation team and governance of the technical delivery. It carries technical ownership of the solution from bid inception through implementation to client delivery, followed by after-sales support and best-practice advice. It involves interactions with internal stakeholders and clients to explain technology solutions, and a clear understanding of the client's business requirements through which to guide an optimal design that meets their needs.

Key responsibilities:
Design simple to medium data solutions for clients using AWS/GCP cloud architecture.
Demonstrate a strong understanding of data warehouses, data marts, data modelling, data structures, databases, and data ingestion and transformation.
Apply working knowledge of ETL as well as database skills.
Apply working knowledge of data modelling, data structures, databases and ETL processes.
Show a strong understanding of relational and non-relational databases and when to use them.
Use leadership and communication skills to collaborate with local leadership as well as our global teams.
Translate technical requirements into ETL/SQL application code; a short translation sketch follows this posting.
Document the project architecture, explain the detailed design to the team, and create low-level to high-level designs.
Create technical documents for ETL and SQL developments using Visio, PowerPoint and other MS Office packages.
Engage with Project Managers, Business Analysts and Application DBAs to implement ETL solutions.
Perform mid- to complex-level tasks independently.
Support Clients, Data Scientists and Analytical Consultants working on marketing solutions.
Work with cross-functional internal teams and external clients.
Apply strong project management and organization skills; ability to lead 1-2 projects with a team size of 2-3 members.
Manage code through code-management systems, including code review and deployment.
Work closely with the QA/Testing team to help identify and implement defect-reduction initiatives.
Work closely with the Architecture team to make sure architecture standards and principles are followed during development.
Perform proofs of concept on new platforms and validate proposed solutions.
Work with the team to establish and reinforce disciplined software development processes, standards, and error recovery procedures.
Understand software development methodologies, including waterfall and agile.
Distribute and manage SQL development work across the team.

Education Qualification:
1. Bachelor's or Master's Degree in Computer Science

Shift timing: GMT (UK shift) - 2 PM to 11 PM

Location: Mumbai
Brand: Merkle
Time Type: Full time
Contract Type: Permanent
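For illustration only, a short sketch of translating a business requirement into SQL application code, as the responsibilities above describe (the dedupe rule, table and column names are hypothetical; SQLite stands in for the project warehouse):

```python
# Sketch: business rule "dedupe customers, keep the most recent record"
# translated into SQL application code. All names are hypothetical.
import sqlite3

RULE_SQL = """
CREATE TABLE customers_clean AS
SELECT customer_id, email, updated_at
FROM (
    SELECT *,
           ROW_NUMBER() OVER (
               PARTITION BY customer_id ORDER BY updated_at DESC
           ) AS rn
    FROM customers_raw
)
WHERE rn = 1;
"""

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE customers_raw(customer_id INT, email TEXT, updated_at TEXT);
    INSERT INTO customers_raw VALUES
        (1,'old@example.com','2024-01-01'),
        (1,'new@example.com','2024-06-01'),
        (2,'b@example.com','2024-03-15');
""")
con.executescript(RULE_SQL)
for row in con.execute("SELECT * FROM customers_clean ORDER BY customer_id"):
    print(row)  # expect the 2024-06-01 record for customer 1
```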
Posted 1 month ago
7.0 - 12.0 years
9 - 14 Lacs
Mumbai
Work from Office
Job Description:
Business Title: Lead Technical Architect
Years of Experience: > 7 years

Must-have skills:
1. Database (SQL Server / Snowflake / Teradata / Redshift / Vertica / Oracle / BigQuery / Azure DW, etc.)
2. ETL tool (Talend, Informatica, IICS (Informatica Cloud))
3. Experience in cloud computing (one or more of AWS, Azure, GCP)
4. Python, UNIX shell scripting, project & resource management
5. SVN, JIRA, automation workflow (Apache Airflow, Tidal, Tivoli or similar)

Good-to-have skills:
1. PySpark, BigQuery, familiarity with NoSQL such as MongoDB
2. Client-facing skills

Job Description:
The Technical Lead / Technical Consultant is a core role and the focal point of the project team, responsible for the whole technical solution and managing the day-to-day delivery. The role focuses on the technical solution architecture, detailed technical design, coaching of the development/implementation team and governance of the technical delivery. It carries technical ownership of the solution from bid inception through implementation to client delivery, followed by after-sales support and best-practice advice. It involves interactions with internal stakeholders and clients to explain technology solutions, and a clear understanding of the client's business requirements through which to guide an optimal design that meets their needs.

Key responsibilities:
Design simple to medium data solutions for clients using AWS/GCP cloud architecture.
Demonstrate a strong understanding of data warehouses, data marts, data modelling, data structures, databases, and data ingestion and transformation.
Apply working knowledge of ETL as well as database skills.
Apply working knowledge of data modelling, data structures, databases and ETL processes.
Show a strong understanding of relational and non-relational databases and when to use them.
Use leadership and communication skills to collaborate with local leadership as well as our global teams.
Translate technical requirements into ETL/SQL application code.
Document the project architecture, explain the detailed design to the team, and create low-level to high-level designs.
Create technical documents for ETL and SQL developments using Visio, PowerPoint and other MS Office packages.
Engage with Project Managers, Business Analysts and Application DBAs to implement ETL solutions.
Perform mid- to complex-level tasks independently.
Support Clients, Data Scientists and Analytical Consultants working on marketing solutions.
Work with cross-functional internal teams and external clients.
Apply strong project management and organization skills; ability to lead 1-2 projects with a team size of 2-3 members.
Manage code through code-management systems, including code review and deployment.
Work closely with the QA/Testing team to help identify and implement defect-reduction initiatives.
Work closely with the Architecture team to make sure architecture standards and principles are followed during development.
Perform proofs of concept on new platforms and validate proposed solutions.
Work with the team to establish and reinforce disciplined software development processes, standards, and error recovery procedures.
Understand software development methodologies, including waterfall and agile.
Distribute and manage SQL development work across the team.

Education Qualification:
Bachelor's or Master's Degree in Computer Science

Shift timing: GMT (UK shift) - 2 PM to 11 PM

Location: Mumbai
Brand: Merkle
Time Type: Full time
Contract Type: Permanent
Posted 1 month ago
9.0 - 14.0 years
20 - 25 Lacs
Mumbai
Work from Office
Job Description:
Business Title: Lead Technical Architect
Years of Experience: > 7 years

Must-have skills:
1. Database (SQL Server / Snowflake / Teradata / Redshift / Vertica / Oracle / BigQuery / Azure DW, etc.)
2. ETL tool (Talend, Informatica, IICS (Informatica Cloud))
3. Experience in cloud computing (one or more of AWS, Azure, GCP)
4. Python, UNIX shell scripting, project & resource management
5. SVN, JIRA, automation workflow (Apache Airflow, Tidal, Tivoli or similar)

Good-to-have skills:
1. PySpark, BigQuery, familiarity with NoSQL such as MongoDB
2. Client-facing skills

Job Description:
The Technical Lead / Technical Consultant is a core role and the focal point of the project team, responsible for the whole technical solution and managing the day-to-day delivery. The role focuses on the technical solution architecture, detailed technical design, coaching of the development/implementation team and governance of the technical delivery. It carries technical ownership of the solution from bid inception through implementation to client delivery, followed by after-sales support and best-practice advice. It involves interactions with internal stakeholders and clients to explain technology solutions, and a clear understanding of the client's business requirements through which to guide an optimal design that meets their needs.

Key responsibilities:
Design simple to medium data solutions for clients using AWS/GCP cloud architecture.
Demonstrate a strong understanding of data warehouses, data marts, data modelling, data structures, databases, and data ingestion and transformation.
Apply working knowledge of ETL as well as database skills.
Apply working knowledge of data modelling, data structures, databases and ETL processes.
Show a strong understanding of relational and non-relational databases and when to use them.
Use leadership and communication skills to collaborate with local leadership as well as our global teams.
Translate technical requirements into ETL/SQL application code.
Document the project architecture, explain the detailed design to the team, and create low-level to high-level designs.
Create technical documents for ETL and SQL developments using Visio, PowerPoint and other MS Office packages.
Engage with Project Managers, Business Analysts and Application DBAs to implement ETL solutions.
Perform mid- to complex-level tasks independently.
Support Clients, Data Scientists and Analytical Consultants working on marketing solutions.
Work with cross-functional internal teams and external clients.
Apply strong project management and organization skills; ability to lead 1-2 projects with a team size of 2-3 members.
Manage code through code-management systems, including code review and deployment.
Work closely with the QA/Testing team to help identify and implement defect-reduction initiatives.
Work closely with the Architecture team to make sure architecture standards and principles are followed during development.
Perform proofs of concept on new platforms and validate proposed solutions.
Work with the team to establish and reinforce disciplined software development processes, standards, and error recovery procedures.
Understand software development methodologies, including waterfall and agile.
Distribute and manage SQL development work across the team.

Education Qualification:
1. Bachelor's or Master's Degree in Computer Science

Shift timing: GMT (UK shift) - 2 PM to 11 PM

Location: Mumbai
Brand: Merkle
Time Type: Full time
Contract Type: Permanent
Posted 1 month ago
5.0 - 9.0 years
25 - 30 Lacs
Bengaluru
Work from Office
Responsibilities
Design, develop, and maintain scalable data pipelines and ETL processes.
Optimize data flow and collection for cross-functional teams.
Build the infrastructure required for optimal extraction, transformation, and loading of data.
Ensure data quality, reliability, and integrity across all data systems.
Collaborate with data scientists and analysts to help implement models and algorithms.
Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, etc.
Create and maintain comprehensive technical documentation.
Evaluate and integrate new data management technologies and tools.

Requirements
3-5 years of professional experience in data engineering roles.
Bachelor's degree in Computer Science, Engineering, or a related field; Master's degree preferred.

Job Description
Expert knowledge of SQL and experience with relational databases (e.g., PostgreSQL, Redshift, TiDB, MySQL, Oracle, Teradata).
Extensive experience with big data technologies (e.g., Hadoop, Spark, Hive, Flink).
Proficiency in at least one programming language such as Python, Java, or Scala.
Experience with data modeling, data warehousing, and building ETL pipelines.
Strong knowledge of data pipeline and workflow management tools (e.g., Airflow, Luigi, NiFi).
Experience with cloud platforms (AWS, Azure, or GCP) and their data services; AWS preferred.
Hands-on experience building streaming pipelines with Flink, Kafka, or Kinesis; Flink preferred. A minimal consumer sketch follows this posting.
Understanding of data governance and data security principles.
Experience with version control systems (e.g., Git) and CI/CD practices.

Preferred Skills
Experience with containerization and orchestration tools (Docker, Kubernetes).
Basic knowledge of machine learning workflows and MLOps.
Experience with NoSQL databases (MongoDB, Cassandra, etc.).
Familiarity with data visualization tools (Tableau, Power BI, etc.).
Experience with real-time data processing.
Knowledge of data governance frameworks and compliance requirements (GDPR, CCPA, etc.).
Experience with infrastructure-as-code tools (Terraform, CloudFormation).

Personal Qualities
Strong problem-solving skills and attention to detail.
Excellent communication skills, both written and verbal.
Ability to work independently and as part of a team.
Proactive approach to identifying and solving problems.
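For illustration only, a minimal streaming-consumer sketch matching the Kafka requirement above (the topic, broker address and message shape are hypothetical; kafka-python is one common client, with Flink or Kinesis as the posting's alternatives):

```python
# Minimal streaming-pipeline consumer sketch. Topic, broker and
# message shape are hypothetical.
import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "orders",                                # hypothetical topic
    bootstrap_servers="localhost:9092",      # assumed broker
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    auto_offset_reset="earliest",
    enable_auto_commit=True,
    group_id="orders-aggregator",
)

totals = {}
for msg in consumer:
    event = msg.value
    # Running aggregate per customer; a real pipeline would window this
    # and sink the results to a warehouse or another topic.
    totals[event["customer_id"]] = (
        totals.get(event["customer_id"], 0.0) + event["amount"]
    )
    print(totals)
```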
Posted 1 month ago