8.0 years
0 Lacs
Hyderābād
On-site
Ready to shape the future of work? At Genpact, we don't just adapt to change—we drive it. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's AI Gigafactory, our industry-first accelerator, is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, this is your moment.

Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.

Inviting applications for the role of Senior Principal Consultant - Teradata SME

We are seeking a highly experienced and knowledgeable Teradata Subject Matter Expert (SME) to provide deep technical expertise and strategic guidance on our existing Teradata data warehouse environment, with a focus on its integration, migration, and potential modernization within the Google Cloud Platform (GCP). You will be the go-to person for complex Teradata-related challenges, optimization initiatives, and architectural decisions, particularly as they relate to our cloud strategy on GCP. You will collaborate with data engineers, cloud architects, analysts, and business stakeholders to ensure our data landscape effectively leverages both Teradata and GCP capabilities.

Responsibilities
- Serve as the primary point of contact and expert resource for all Teradata-related technical inquiries and issues, including those related to GCP integration.
- Provide deep technical expertise in Teradata architecture, utilities, performance tuning, and query optimization, with an understanding of how these aspects translate to or interact with GCP services.
- Lead efforts to integrate Teradata with GCP services for data ingestion, processing, and analysis.
- Provide guidance and expertise on potential migration strategies from Teradata to GCP data warehousing solutions like BigQuery (a sketch of one such data-movement step follows this listing).
- Optimize Teradata performance in the context of data pipelines that may involve GCP components.
- Troubleshoot and resolve complex Teradata system and application issues, considering potential interactions with GCP.
- Develop and maintain best practices, standards, and documentation for Teradata development and administration, with a focus on cloud integration scenarios.
- Collaborate with cloud architects and data engineers to design hybrid data solutions leveraging both Teradata and GCP.
- Provide guidance and mentorship to team members on Teradata best practices and techniques within a cloud-focused context.
- Participate in capacity planning and forecasting for the Teradata environment, considering its future within our GCP strategy.
- Evaluate and recommend Teradata upgrades, patches, and new features, assessing their compatibility and value within a GCP ecosystem.
- Ensure adherence to data governance policies and security standards across both Teradata and GCP environments.
- Stay current with the latest Teradata features, trends, and best practices, as well as relevant GCP data warehousing and integration services.

Qualifications we seek in you!

Minimum Qualifications / Skills
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- Extensive and deep experience (typically 8+ years) working with Teradata data warehouse systems.
- Expert-level knowledge of Teradata architecture, including MPP concepts, BYNET, and storage management.
- Proven ability to write and optimize complex SQL queries in Teradata.
- Strong experience with Teradata utilities (e.g., BTEQ, FastLoad, MultiLoad, TPump).
- Deep understanding of Teradata performance tuning techniques, including workload management and query optimization.
- Experience with Teradata data modeling principles and best practices.
- Excellent analytical, problem-solving, and troubleshooting skills specific to Teradata environments, with an aptitude for understanding cloud integration.
- Strong communication, collaboration, and interpersonal skills, with the ability to explain complex technical concepts clearly, including those bridging Teradata and GCP.
- Familiarity with Google Cloud Platform (GCP) and its core data services (e.g., BigQuery, Cloud Storage, Dataflow).

Preferred Qualifications / Skills
- Teradata certifications.
- Google Cloud certifications (e.g., Cloud Architect, Data Engineer).
- Experience with Teradata Viewpoint and other monitoring tools.
- Knowledge of data integration tools (e.g., Informatica, Talend) and their interaction with both Teradata and GCP.
- Experience with workload management and prioritization in Teradata, and how it might be approached in GCP.
- Familiarity with data security concepts and implementation within both Teradata and GCP.
- Experience with migrating data to or from Teradata, especially to GCP.
- Exposure to cloud-based data warehousing solutions (specifically BigQuery) and their architectural differences from Teradata.
- Scripting skills (e.g., Shell, Python) for automation of tasks across both Teradata and GCP.

Why join Genpact?
- Be a transformation leader – Work at the cutting edge of AI, automation, and digital innovation
- Make an impact – Drive change for global enterprises and solve business challenges that matter
- Accelerate your career – Get hands-on experience, mentorship, and continuous learning opportunities
- Work with the best – Join 140,000+ bold thinkers and problem-solvers who push boundaries every day
- Thrive in a values-driven culture – Our courage, curiosity, and incisiveness - built on a foundation of integrity and inclusion - allow your ideas to fuel progress

Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: Up. Let's build tomorrow together.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please do note that Genpact does not charge fees to process job applications and applicants are not required to pay to participate in our hiring process in any other way.
Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.

Job: Senior Principal Consultant
Primary Location: India-Hyderabad
Schedule: Full-time
Education Level: Bachelor's / Graduation / Equivalent
Job Posting: Jun 16, 2025, 11:49:57 PM
Unposting Date: Ongoing
Master Skills List: Digital
Job Category: Full Time
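As a concrete illustration of the Teradata-to-BigQuery data movement this role guides, here is a minimal sketch, assuming the `teradatasql` and `google-cloud-bigquery` Python packages; the hostname, credentials, and table names are hypothetical placeholders, not details from the posting.

```python
# Hedged sketch: stage a Teradata table into BigQuery in batches.
# All connection details and table names below are illustrative.
import teradatasql
from google.cloud import bigquery

TD_CONN = {"host": "td-prod.example.com", "user": "dbc", "password": "***"}  # placeholders
SOURCE_TABLE = "sales_db.daily_orders"            # hypothetical Teradata table
TARGET_TABLE = "my-project.staging.daily_orders"  # hypothetical BigQuery table

def copy_table(batch_size: int = 50_000) -> None:
    bq = bigquery.Client()
    with teradatasql.connect(**TD_CONN) as con, con.cursor() as cur:
        cur.execute(f"SELECT * FROM {SOURCE_TABLE}")
        columns = [d[0] for d in cur.description]
        while True:
            rows = cur.fetchmany(batch_size)
            if not rows:
                break
            # Type coercion (dates, decimals) is elided for brevity.
            payload = [dict(zip(columns, r)) for r in rows]
            errors = bq.insert_rows_json(TARGET_TABLE, payload)
            if errors:
                raise RuntimeError(f"BigQuery insert errors: {errors}")

copy_table()
```

A production migration would more likely stage compressed extracts in Cloud Storage and run BigQuery load jobs rather than streaming inserts; the batch loop above simply makes the shape of the transfer explicit.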
Posted 1 week ago
5.0 years
1 - 9 Lacs
Hyderābād
On-site
We have an opportunity to impact your career and provide an adventure where you can push the limits of what's possible.

As a Lead Software Engineer at JPMorgan Chase within Consumer and Community Banking - Data Technology, you are an integral part of an agile team that works to enhance, build, and deliver trusted market-leading technology products in a secure, stable, and scalable way. As a core technical contributor, you are responsible for conducting critical technology solutions across multiple technical areas within various business functions in support of the firm's business objectives.

Job responsibilities
- Executes creative software solutions, design, development, and technical troubleshooting with the ability to think beyond routine or conventional approaches to build solutions or break down technical problems
- Develops secure high-quality production code, and reviews and debugs code written by others
- Identifies opportunities to eliminate or automate remediation of recurring issues to improve overall operational stability of software applications and systems
- Leads evaluation sessions with external vendors, startups, and internal teams to drive outcomes-oriented probing of architectural designs, technical credentials, and applicability for use within existing systems and information architecture
- Leads communities of practice across Software Engineering to drive awareness and use of new and leading-edge technologies
- Becomes a technical mentor in the team

Required qualifications, capabilities, and skills
- Formal training or certification on software engineering concepts and 5+ years applied experience
- Experience in software engineering, including hands-on expertise in ETL/data pipeline and data lake platforms like Teradata and Snowflake
- Hands-on practical experience delivering system design, application development, testing, and operational stability
- Proficiency in AWS services, especially Aurora PostgreSQL RDS (see the sketch after this listing)
- Proficiency in automation and continuous delivery methods
- Proficient in all aspects of the Software Development Life Cycle
- Advanced understanding of agile methodologies such as CI/CD, Application Resiliency, and Security
- Demonstrated proficiency in software applications and technical processes within a technical discipline (e.g., cloud, artificial intelligence, machine learning, mobile, etc.)
- In-depth knowledge of the financial services industry and its IT systems

Preferred qualifications, capabilities, and skills
- Experience in re-engineering and migrating on-premises data solutions to and for the cloud
- Experience in Infrastructure as Code (Terraform) for cloud-based data infrastructure
- Experience in building on emerging cloud serverless managed services to minimize/eliminate physical/virtual server footprint
- Advanced in Java; Python a nice-to-have
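To make the pipeline-engineering flavor of this role concrete, here is a minimal sketch of an idempotent batch upsert into an Aurora PostgreSQL table, assuming the `psycopg2` package and a hypothetical `orders` table with a unique `order_id`; the DSN is a placeholder.

```python
# Hedged sketch: idempotent batch upsert into Aurora PostgreSQL, the
# kind of step a resilient ETL pipeline can re-run safely after failure.
import psycopg2
from psycopg2.extras import execute_values

DSN = "host=aurora.example.com dbname=etl user=etl_user password=***"  # placeholder

def upsert_orders(rows: list[tuple]) -> None:
    """rows: (order_id, customer_id, amount) tuples from an upstream extract."""
    sql = """
        INSERT INTO orders (order_id, customer_id, amount)
        VALUES %s
        ON CONFLICT (order_id) DO UPDATE
        SET customer_id = EXCLUDED.customer_id,
            amount      = EXCLUDED.amount
    """
    with psycopg2.connect(DSN) as conn, conn.cursor() as cur:
        execute_values(cur, sql, rows)  # one round trip per batch

upsert_orders([(1, 42, 99.50), (2, 7, 12.00)])
```

The `ON CONFLICT` clause (which assumes a unique constraint on `order_id`) is what makes re-runs safe: replayed batches update rather than duplicate.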
Posted 1 week ago
12.0 years
0 Lacs
Hyderabad, Telangana, India
Remote
Overview

Deputy Director - Data Engineering

PepsiCo operates in an environment undergoing immense and rapid change. Big-data and digital technologies are driving business transformation that is unlocking new capabilities and business innovations in areas like eCommerce, mobile experiences and IoT. The key to winning in these areas is being able to leverage enterprise data foundations built on PepsiCo's global business scale to enable business insights, advanced analytics, and new product development. PepsiCo's Data Management and Operations team is tasked with the responsibility of developing quality data collection processes, maintaining the integrity of our data foundations, and enabling business leaders and data scientists across the company to have rapid access to the data they need for decision-making and innovation.

What PepsiCo Data Management and Operations does:
- Maintain a predictable, transparent, global operating rhythm that ensures always-on access to high-quality data for stakeholders across the company.
- Responsible for day-to-day data collection, transportation, maintenance/curation, and access to the PepsiCo corporate data asset.
- Work cross-functionally across the enterprise to centralize data and standardize it for use by business, data science or other stakeholders.
- Increase awareness about available data and democratize access to it across the company.

As a data engineering lead, you will be the key technical expert overseeing PepsiCo's data product build & operations and drive a strong vision for how data engineering can proactively create a positive impact on the business. You'll be empowered to create & lead a strong team of data engineers who build data pipelines into various source systems, rest data on the PepsiCo Data Lake, and enable exploration and access for analytics, visualization, machine learning, and product development efforts across the company. As a member of the data engineering team, you will help lead the development of very large and complex data applications into public cloud environments, directly impacting the design, architecture, and implementation of PepsiCo's flagship data products around topics like revenue management, supply chain, manufacturing, and logistics. You will work closely with process owners, product owners and business users. You'll be working in a hybrid environment with in-house, on-premises data sources as well as cloud and remote systems.

Responsibilities
- Data engineering lead role for D&Ai data modernization (MDIP).
- Candidates must be flexible to work an alternative schedule: either a traditional work week from Monday to Friday, or Tuesday to Saturday, or Sunday to Thursday, depending upon coverage requirements of the job. The candidate can work with the immediate supervisor to change the work schedule on a rotational basis depending on the product and project requirements.
- Manage a team of data engineers and data analysts by delegating project responsibilities and managing their flow of work as well as empowering them to realize their full potential.
- Design, structure and store data into unified data models and link them together to make the data reusable for downstream products.
- Manage and scale data pipelines from internal and external data sources to support new product launches and drive data quality across data products.
- Create reusable accelerators and solutions to migrate data from legacy data warehouse platforms such as Teradata to Azure Databricks and Azure SQL (see the sketch after this listing).
- Enable and accelerate standards-based development, prioritizing reuse of code and adopting test-driven development, unit testing and test automation with end-to-end observability of data.
- Build and own the automation and monitoring frameworks that capture metrics and operational KPIs for data pipeline quality, performance and cost.
- Collaborate with internal clients (product teams, sector leads, data science teams) and external partners (SI partners/data providers) to drive solutioning and clarify solution requirements.
- Evolve the architectural capabilities and maturity of the data platform by engaging with enterprise architects to build and support the right domain architecture for each application, following well-architected design standards.
- Define and manage SLAs for data products and processes running in production.
- Create documentation for learnings and knowledge transfer to internal associates.

Qualifications
- 12+ years of overall technology experience that includes at least 5+ years of hands-on software development, data engineering, and systems architecture.
- 8+ years of experience with Data Lakehouse, Data Warehousing, and Data Analytics tools.
- 6+ years of experience in SQL optimization and performance tuning on MS SQL Server, Azure SQL or any other popular RDBMS.
- 6+ years of experience in Python/PySpark/Scala programming on big data platforms like Databricks.
- 4+ years of cloud data engineering experience in Azure or AWS. Fluent with Azure cloud services; Azure Data Engineering certification is a plus.
- Experience with integration of multi-cloud services with on-premises technologies.
- Experience with data modelling, data warehousing, and building high-volume ETL/ELT pipelines.
- Experience with data profiling and data quality tools like Great Expectations.
- Experience building/operating highly available, distributed systems for data extraction, ingestion, and processing of large data sets.
- Experience with at least one business intelligence tool such as Power BI or Tableau.
- Experience with running and scaling applications on cloud infrastructure and containerized services like Kubernetes.
- Experience with version control systems like ADO or GitHub and CI/CD tools for DevOps automation and deployments.
- Experience with Azure Data Factory, Azure Databricks and Azure Machine Learning tools.
- Experience with statistical/ML techniques is a plus.
- Experience with building solutions in the retail or supply chain space is a plus.
- Understanding of metadata management, data lineage, and data glossaries is a plus.
- BA/BS in Computer Science, Math, Physics, or other technical fields.
- Candidates are expected to be in the office at the assigned location at least 3 days a week, with the days at work coordinated with the immediate supervisor.

Skills, Abilities, Knowledge
- Excellent communication skills, both verbal and written, along with the ability to influence and demonstrate confidence in communications with senior level management.
- Proven track record of leading and mentoring data teams.
- Strong change manager; comfortable with change, especially that which arises through company growth.
- Ability to understand and translate business requirements into data and technical requirements.
- High degree of organization and ability to manage multiple, competing projects and priorities simultaneously.
- Positive and flexible attitude to enable adjusting to different needs in an ever-changing environment.
- Strong leadership, organizational and interpersonal skills; comfortable managing trade-offs.
- Foster a team culture of accountability, communication, and self-management.
- Proactively drives impact and engagement while bringing others along.
- Consistently attain/exceed individual and team goals.
- Ability to lead others without direct authority in a matrixed environment.
- Comfortable working in a hybrid environment with teams consisting of contractors as well as FTEs spread across multiple PepsiCo locations.
- Domain knowledge in the CPG industry with supply chain/GTM background is preferred.
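As an illustration of the Teradata-to-Databricks accelerator work described above, here is a minimal sketch assuming a Databricks notebook (where `spark` and `dbutils` are predefined) with the Teradata JDBC driver installed; the host, secret scope, and table names are hypothetical.

```python
# Hedged sketch: one reusable migration step that reads a Teradata table
# over JDBC and lands it as a Delta table on Databricks.
def migrate_table(td_table: str, delta_table: str) -> None:
    df = (
        spark.read.format("jdbc")
        .option("driver", "com.teradata.jdbc.TeraDriver")
        .option("url", "jdbc:teradata://td-prod.example.com/DATABASE=sales_db")
        .option("dbtable", td_table)
        .option("user", dbutils.secrets.get("etl", "td-user"))   # secret scope assumed
        .option("password", dbutils.secrets.get("etl", "td-pass"))
        .load()
    )
    (
        df.write.format("delta")
        .mode("overwrite")   # idempotent re-runs during cutover
        .saveAsTable(delta_table)
    )

migrate_table("daily_orders", "bronze.daily_orders")
```

Wrapping the read/write pair in one function is what makes it an "accelerator": the same step can be mapped over a table inventory during a phased cutover.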
Posted 1 week ago
15.0 years
0 Lacs
Kochi, Kerala, India
On-site
Introduction

Joining the IBM Technology Expert Labs teams means you'll have a career delivering world-class services for our clients. As the ultimate expert in IBM products, you'll bring together all the necessary technology and services to help customers solve their most challenging problems. Working in IBM Technology Expert Labs means accelerating the time to value confidently and ensuring speed and insight while our clients focus on what they do best—running and growing their business. Excellent onboarding and an industry-leading learning culture will set you up for a positive impact, while advancing your career. Our culture is collaborative and experiential. As part of a team, you will be surrounded by bright minds and keen co-creators—always willing to help and be helped—as you apply passion to work that will positively impact the world around us.

Your Role And Responsibilities

The candidate is responsible for:
- Db2 installation and configuration on the following environments: on-prem, multi-cloud, Red Hat OpenShift cluster, HADR, non-DPF and DPF.
- Migration of other databases to Db2 (e.g., Teradata, Snowflake, SAP, or Cloudera to Db2).
- Creating high-level and detail-level designs and maintaining product roadmaps that cover both modernization and leveraging cloud solutions.
- Designing scalable, performant, and cost-effective data architectures within the Lakehouse to support diverse workloads, including reporting, analytics, data science, and AI/ML.
- Performing health checks of the databases, making recommendations, and delivering tuning at the database and system level (see the probe sketch after this listing).
- Deploying Db2 databases as containers within Red Hat OpenShift clusters.
- Configuring containerized database instances, persistent storage, and network settings to optimize performance and reliability.
- Leading the architectural design and implementation of solutions on IBM watsonx.data, ensuring alignment with overall enterprise data strategy and business objectives.
- Defining and optimizing the watsonx.data ecosystem, including integration with other IBM watsonx components (watsonx.ai, watsonx.governance) and existing data infrastructure (Db2, Netezza, cloud data sources).
- Establishing best practices for data modeling, schema evolution, and data organization within the watsonx.data lakehouse.
- Acting as a subject matter expert on Lakehouse architecture, providing technical leadership and guidance to data engineering, analytics, and development teams.
- Mentoring junior architects and engineers, fostering their growth and knowledge in modern data platforms.
- Participating in the development of architecture governance processes and promoting best practices across the organization.
- Communicating complex technical concepts to both technical and non-technical stakeholders.

Required Technical And Professional Expertise
- 15+ years of experience in data architecture, data engineering, or a similar role, with significant hands-on experience in cloud data platforms.
- Strong proficiency in Db2, SQL and Python.
- Strong understanding of: database design and modelling (dimensional, normalized, NoSQL schemas); normalization and indexing; data warehousing and ETL processes; cloud platforms (AWS, Azure, GCP); big data technologies (e.g., Hadoop, Spark).
- Database migration project experience from one database to another (target database Db2).
- Experience deploying Db2 databases as containers within Red Hat OpenShift clusters and configuring containerized database instances, persistent storage, and network settings to optimize performance and reliability.
- Excellent communication, collaboration, problem-solving, and leadership skills.

Preferred Technical And Professional Experience
- Experience with machine learning environments and LLMs.
- Certification in IBM watsonx.data or related IBM data and AI technologies.
- Hands-on experience with a Lakehouse platform (e.g., Databricks, Snowflake).
- Exposure to implementing, or an understanding of, database replication processes.
- Experience with integrating watsonx.data with GenAI or LLM initiatives (e.g., RAG architectures).
- Experience with NoSQL databases (e.g., MongoDB, Cassandra).
- Experience with data modeling tools (e.g., ER/Studio, ERwin).
- Knowledge of data governance and compliance standards (e.g., GDPR, HIPAA).
- Strong soft skills.
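To ground the database health-check responsibility, here is a minimal probe sketch using the `ibm_db` driver; the DSN is a placeholder, and the `SYSIBMADM` administrative views referenced are assumed to be enabled on the target Db2 LUW instance.

```python
# Hedged sketch: a small Db2 health check of the kind the role implies:
# report the instance service level and flag full tablespaces.
import ibm_db

DSN = (
    "DATABASE=BLUDB;HOSTNAME=db2.example.com;PORT=50000;"
    "PROTOCOL=TCPIP;UID=db2inst1;PWD=***;"  # placeholder credentials
)

conn = ibm_db.connect(DSN, "", "")

# Instance service level (useful when planning upgrades/patches).
stmt = ibm_db.exec_immediate(
    conn, "SELECT service_level FROM SYSIBMADM.ENV_INST_INFO"
)
print("Service level:", ibm_db.fetch_assoc(stmt)["SERVICE_LEVEL"])

# Flag tablespaces above 80% utilization (administrative view assumed).
stmt = ibm_db.exec_immediate(
    conn,
    "SELECT tbsp_name, tbsp_utilization_percent "
    "FROM SYSIBMADM.TBSP_UTILIZATION "
    "WHERE tbsp_utilization_percent > 80",
)
row = ibm_db.fetch_assoc(stmt)
while row:
    print(f"{row['TBSP_NAME']}: {row['TBSP_UTILIZATION_PERCENT']}% used")
    row = ibm_db.fetch_assoc(stmt)

ibm_db.close(conn)
```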
Posted 1 week ago
0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Req ID: 328445

NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Business Intelligence Senior Analyst to join our team in Noida, Uttar Pradesh (IN-UP), India (IN).

Job Description

Role Description: As a Cognos Developer, you will be a key contributor to our business intelligence initiatives. You will be responsible for building, testing, and deploying Cognos reports, managing Framework Manager packages, and ensuring the accuracy and reliability of our data visualizations. Your ability to collaborate with cross-functional teams and your expertise in Cognos Analytics will be essential for success in this role.

Responsibilities:
- Design, develop, and deploy Cognos reports and dashboards using Cognos Analytics 11/12.
- Build and maintain Cognos reports using Framework Manager and Report Studio.
- Develop reports with Drill Through, List, Crosstab, and Prompt pages, Page grouping & sections.
- Utilize Cognos Data Modules and Visualization Gallery to create interactive and insightful visualizations.
- Build, manage, and maintain Framework Manager packages.
- Ensure data integrity and consistency within Cognos packages.
- Optimize Framework Manager performance.
- Understand and apply data warehousing concepts.
- Possess basic knowledge of Extract, Transform, Load (ETL) processes.
- Write and optimize SQL queries for data retrieval and manipulation.
- Perform data analysis and validation using SQL (see the sketch after this listing).
- Build, test, and deploy Cognos reports and dashboards.
- Ensure reports meet business requirements and quality standards.
- Analyze business requirements and translate them into technical specifications.
- Collaborate with stakeholders to understand reporting needs.
- Create and maintain technical documentation for Cognos reports and packages.
- Provide support to end-users on Cognos reporting.
- Collaborate with cross-functional teams to deliver business intelligence solutions.
- Communicate effectively with team members and stakeholders.

Technical Skills: Cognos Analytics, Oracle, Teradata
- Experience in Cognos Analytics 11/12 (Data Modules, Framework Manager Packages, Report Studio, Visualization Gallery, Cognos Dashboard).
- Good knowledge of Cognos packages using Framework Manager.
- Design and develop reports using Report Studio.
- Good SQL skills for data retrieval and manipulation.
- Experience in data warehousing and business intelligence.
- Basic knowledge of Extract, Transform, Load (ETL) processes.

About NTT DATA

NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com

NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us . This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here.
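As one example of the SQL-based validation this role calls for, here is a hedged sketch that reconciles a reporting summary against its fact table before a Cognos package is published, assuming the `teradatasql` driver; the schema, table, and column names are illustrative.

```python
# Hedged sketch: validate that report totals match the warehouse fact
# table. Any returned rows indicate a region whose numbers disagree.
import teradatasql

DSN = {"host": "edw.example.com", "user": "bi_dev", "password": "***"}  # placeholders

VALIDATION_SQL = """
SELECT r.region
     , r.reported_revenue
     , f.fact_total
FROM   rpt.revenue_summary r
JOIN  (SELECT region, SUM(revenue) AS fact_total
       FROM dw.sales_fact
       GROUP BY region) f
  ON   f.region = r.region
WHERE  r.reported_revenue <> f.fact_total
"""

with teradatasql.connect(**DSN) as con, con.cursor() as cur:
    cur.execute(VALIDATION_SQL)
    mismatches = cur.fetchall()

if mismatches:
    for region, reported, actual in mismatches:
        print(f"MISMATCH {region}: report={reported} fact={actual}")
else:
    print("Report totals reconcile with the fact table.")
```

Aggregating the fact table in a derived table before the join keeps summary rows from being multiplied, which is a common pitfall in reconciliation queries.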
Posted 1 week ago
3.0 years
0 Lacs
Pune, Maharashtra, India
On-site
We're hiring a Teradata Administrator for a US product-based company in Pune (permanent opportunity).

Location: Hinjewadi Phase II (hybrid)
Shift: 9:30 PM - 6:30 AM IST (Indian Standard Time), night shift
Experience: 3+ years

Are you a database expert with a passion for high-impact infrastructure work and a strong command of enterprise systems? We're looking for a Physical Database Architect who can help design, build, optimize, and support mission-critical database environments.

What You'll Do:
• Translate logical data models into robust, scalable physical database architectures
• Drive physical database design, deployment, performance tuning, and security configuration
• Serve as the primary development database contact, collaborating with application teams and production DBAs
• Support incident resolution, performance troubleshooting, and proactive monitoring (see the sketch after this listing)
• Align IT infrastructure with business strategies by partnering with BAs, architects, and development teams
• Provide technical consultation on infrastructure planning and implementation
• Evaluate service options, recommend improvements, and ensure designs meet enterprise architecture standards

Required Experience:
✔️ 2+ years working with Teradata database technologies
✔️ 2+ years of experience in database performance tuning and troubleshooting
✔️ 2+ years of hands-on SQL or similar query language use
✔️ 1+ years working with database monitoring tools such as Foglight or equivalents
✔️ 2+ years supporting development projects
✔️ 1+ years of experience in database administration

What You Bring:
• Strong technical foundation in database and infrastructure design
• Excellent cross-functional collaboration skills with a focus on delivery and uptime
• Proactive mindset with strong problem-solving and performance analysis abilities
• Commitment to continuous improvement, documentation, and best practices

If you're ready to make an impact by driving scalable, reliable database solutions—we want to hear from you!

Kindly share your updated CV at rakhee.su@peoplefy.com
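To illustrate the proactive-monitoring side of the role, here is a minimal sketch that flags databases nearing their permanent-space limit via the standard `DBC.DiskSpaceV` dictionary view, assuming the `teradatasql` driver; the host and credentials are placeholders.

```python
# Hedged sketch: a space-monitoring check a Teradata administrator
# might schedule, summing CurrentPerm vs MaxPerm per database.
import teradatasql

SQL = """
SELECT DatabaseName
     , SUM(CurrentPerm) AS used_bytes
     , SUM(MaxPerm)     AS max_bytes
FROM   DBC.DiskSpaceV
GROUP BY 1
HAVING SUM(MaxPerm) > 0
   AND SUM(CurrentPerm) > 0.85 * SUM(MaxPerm)   -- over 85% full
ORDER BY 2 DESC
"""

with teradatasql.connect(host="td.example.com", user="dba", password="***") as con:
    cur = con.cursor()
    cur.execute(SQL)
    for name, used, mx in cur.fetchall():
        print(f"{name.strip()}: {used / mx:.0%} of {mx / 2**30:.1f} GiB used")
```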
Posted 1 week ago
0.0 years
0 Lacs
Bengaluru / Bangalore, Karnataka, India
On-site
Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose - the relentless pursuit of a world that works better for people - we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.

Inviting applications for the role of Lead Consultant - ML Engineer!

In this role, we are looking for candidates who have relevant years of experience in designing and developing machine learning and deep learning systems, who have professional software development experience, and who are hands-on in running machine learning tests and experiments and implementing appropriate ML algorithms.

Responsibilities
- Drive the vision for a modern data and analytics platform to deliver well-architected and engineered data and analytics products leveraging the cloud tech stack and third-party products
- Close the gap between ML research and production to create ground-breaking new products and features and solve problems for our customers
- Design, develop, test, and deploy data pipelines, machine learning infrastructure and client-facing products and services
- Build and implement machine learning models and prototype solutions for proof-of-concept (see the sketch after this listing)
- Scale existing ML models into production on a variety of cloud platforms
- Analyze and resolve architectural problems, working closely with engineering, data science and operations teams

Qualifications we seek in you!

Minimum Qualifications / Skills
- Bachelor's degree in computer science engineering, information technology or BSc in Computer Science, Mathematics or a similar field; Master's degree is a plus
- Integration - APIs, micro-services and ETL/ELT patterns
- DevOps (good to have) - Ansible, Jenkins, ELK
- Containerization - Docker, Kubernetes etc.
- Orchestration - Google Composer
- Languages and scripting - Python, Scala, Java etc.
- Cloud services - GCP, Snowflake
- Analytics and ML tooling - SageMaker, ML Studio
- Execution paradigm - low latency/streaming, batch

Preferred Qualifications / Skills
- Data platforms - DBT, Fivetran and Data Warehouse (Teradata, Redshift, BigQuery, Snowflake etc.)
- Visualization tools - Power BI, Tableau

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook. Furthermore, please do note that Genpact does not charge fees to process job applications and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
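As a concrete example of the proof-of-concept modeling work this role describes, here is a minimal sketch using scikit-learn, with synthetic data standing in for a real feature pipeline; nothing here reflects an actual Genpact system.

```python
# Hedged sketch: a PoC model with a train/test split and a held-out
# metric, making the "ML test/experiment" step explicit.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=5_000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

model = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)
auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
print(f"Held-out ROC AUC: {auc:.3f}")  # gate promotion to production on this
```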
Posted 1 week ago
0.0 years
0 Lacs
Hyderabad / Secunderabad, Telangana, India
On-site
Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose - the relentless pursuit of a world that works better for people - we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.

Inviting applications for the role of Lead Consultant - ML Engineer!

In this role, we are looking for candidates who have relevant years of experience in designing and developing machine learning and deep learning systems, who have professional software development experience, and who are hands-on in running machine learning tests and experiments and implementing appropriate ML algorithms.

Responsibilities
- Drive the vision for a modern data and analytics platform to deliver well-architected and engineered data and analytics products leveraging the cloud tech stack and third-party products
- Close the gap between ML research and production to create ground-breaking new products and features and solve problems for our customers
- Design, develop, test, and deploy data pipelines, machine learning infrastructure and client-facing products and services
- Build and implement machine learning models and prototype solutions for proof-of-concept
- Scale existing ML models into production on a variety of cloud platforms
- Analyze and resolve architectural problems, working closely with engineering, data science and operations teams

Qualifications we seek in you!

Minimum Qualifications / Skills
- Bachelor's degree in computer science engineering, information technology or BSc in Computer Science, Mathematics or a similar field; Master's degree is a plus
- Integration - APIs, micro-services and ETL/ELT patterns
- DevOps (good to have) - Ansible, Jenkins, ELK
- Containerization - Docker, Kubernetes etc.
- Orchestration - Google Composer
- Languages and scripting - Python, Scala, Java etc.
- Cloud services - GCP, Snowflake
- Analytics and ML tooling - SageMaker, ML Studio
- Execution paradigm - low latency/streaming, batch

Preferred Qualifications / Skills
- Data platforms - DBT, Fivetran and Data Warehouse (Teradata, Redshift, BigQuery, Snowflake etc.)
- Visualization tools - Power BI, Tableau

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook. Furthermore, please do note that Genpact does not charge fees to process job applications and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
Posted 1 week ago
5.0 - 7.0 years
8 - 10 Lacs
Kochi
Work from Office
As a Software Developer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include:
- Implementing and validating predictive models as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques
- Designing and implementing various enterprise search applications such as Elasticsearch and Splunk for client requirements
- Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise
- Design and implement efficient database schemas and data models using Teradata.
- Optimize SQL queries and stored procedures for performance (see the sketch after this listing).
- Perform database administration tasks including installation, configuration, and maintenance of Teradata systems.

Preferred technical and professional experience
- You thrive on teamwork and have excellent verbal and written communication skills.
- Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions.
- Ability to communicate results to technical and non-technical audiences.
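To make the schema-design and query-tuning expectations concrete, here is a hedged sketch assuming the `teradatasql` driver: a table with a high-cardinality primary index for even AMP distribution, statistics collection, and an `EXPLAIN` to inspect the optimizer's plan. All object names are illustrative.

```python
# Hedged sketch: Teradata schema design plus basic tuning steps.
import teradatasql

DDL = """
CREATE MULTISET TABLE sales_db.orders (
    order_id    BIGINT NOT NULL,
    customer_id INTEGER NOT NULL,
    order_ts    TIMESTAMP(0),
    amount      DECIMAL(12, 2)
)
PRIMARY INDEX (order_id)  -- high-cardinality column for even distribution
"""

with teradatasql.connect(host="td.example.com", user="dev", password="***") as con:
    cur = con.cursor()
    cur.execute(DDL)
    # Statistics drive the optimizer's join and retrieval choices.
    cur.execute("COLLECT STATISTICS ON sales_db.orders COLUMN (customer_id)")
    # EXPLAIN shows the plan without running the query.
    cur.execute("EXPLAIN SELECT * FROM sales_db.orders WHERE customer_id = 42")
    for (step,) in cur.fetchall():
        print(step)
```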
Posted 1 week ago
6.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
As a Data Platform Solution Engineer (SE), you will play a pivotal role in helping enterprises unlock the full potential of Microsoft's cloud database and analytics stack across every stage of deployment. You'll collaborate closely with engineering leaders and platform teams to accelerate the Fabric Data Platform, including Azure Databases and Analytics, through hands-on engagements like Proof of Concepts, hackathons, and architecture workshops. This opportunity will allow you to accelerate your career growth, develop deep business acumen, hone your technical skills, and become adept at solution design and deployment. As a trusted technical advisor, you'll guide customers through secure, scalable solution design, influence technical decisions, and accelerate database and analytics migration into their deployment workflows. In summary, you'll help customers modernize their data platform and realize the full value of Microsoft's platform, all while enjoying flexible work opportunities.

Microsoft's mission is to empower every person and every organization on the planet to achieve more. As employees we come together with a growth mindset, innovate to empower others, and collaborate to realize our shared goals. Each day we build on our values of respect, integrity, and accountability to create a culture of inclusion where everyone can thrive at work and beyond.

Responsibilities
- Drive technical conversations with decision makers using demos and PoCs to influence solution design and enable production deployments.
- Lead hands-on engagements—hackathons and architecture workshops—to accelerate adoption of Microsoft's cloud platforms.
- Build trusted relationships with platform leads, co-designing secure, scalable architectures and solutions.
- Resolve technical blockers and objections, collaborating with engineering to share insights and improve products.
- Maintain deep expertise in the Analytics Portfolio: Microsoft Fabric (OneLake, DW, real-time intelligence, BI, Copilot), Azure Databricks, Purview Data Governance, and Azure Databases: SQL DB, Cosmos DB, PostgreSQL.
- Maintain and grow expertise in on-prem EDW (Teradata, Netezza, Exadata), Hadoop and BI solutions.
- Represent Microsoft through thought leadership in cloud Database & Analytics communities and customer forums.

Qualifications (preferred)
- 6+ years technical pre-sales, technical consulting, or technology delivery, or related experience; OR equivalent experience
- 4+ years experience with cloud and hybrid, or on-premises infrastructure, architecture designs, migrations, industry standards, and/or technology management
- Proficient in data warehouse & big data migration, including on-prem appliances (Teradata, Netezza, Oracle), Hadoop (Cloudera, Hortonworks) and Azure Synapse Gen2
- OR 5+ years technical pre-sales or technical consulting experience; OR Bachelor's Degree in Computer Science, Information Technology, or related field AND 4+ years technical pre-sales or technical consulting experience; OR Master's Degree in Computer Science, Information Technology, or related field AND 3+ years technical pre-sales or technical consulting experience; OR equivalent experience
- Expert on Azure Databases (SQL DB, Cosmos DB, PostgreSQL), from migration & modernization to creating new AI apps
- Expert on Azure Analytics (Fabric, Azure Databricks, Purview) and other cloud products (BigQuery, Redshift, Snowflake) in data warehouse, data lake, big data, analytics, real-time intelligence, and reporting using integrated Data Security & Governance
- Proven ability to lead technical engagements (e.g., hackathons, PoCs, MVPs) that drive production-scale outcomes

Microsoft is an equal opportunity employer. Consistent with applicable law, all qualified applicants will receive consideration for employment without regard to age, ancestry, citizenship, color, family or medical care leave, gender identity or expression, genetic information, immigration status, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran or military status, race, ethnicity, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable local laws, regulations and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application process, read more about requesting accommodations.
Posted 1 week ago
6.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
As a Data Platform Solution Engineer (SE), you will play a pivotal role in helping enterprises unlock the full potential of Microsoft's cloud database and analytics stack across every stage of deployment. You'll collaborate closely with engineering leaders and platform teams to accelerate the Fabric Data Platform, including Azure Databases and Analytics, through hands-on engagements like Proof of Concepts, hackathons, and architecture workshops. This opportunity will allow you to accelerate your career growth, develop deep business acumen, hone your technical skills, and become adept at solution design and deployment. As a trusted technical advisor, you'll guide customers through secure, scalable solution design, influence technical decisions, and accelerate database and analytics migration into their deployment workflows. In summary, you'll help customers modernize their data platform and realize the full value of Microsoft's platform, all while enjoying flexible work opportunities.

Microsoft's mission is to empower every person and every organization on the planet to achieve more. As employees we come together with a growth mindset, innovate to empower others, and collaborate to realize our shared goals. Each day we build on our values of respect, integrity, and accountability to create a culture of inclusion where everyone can thrive at work and beyond.

Responsibilities
- Drive technical sales with decision makers using demos and PoCs to influence solution design and enable production deployments.
- Lead hands-on engagements—hackathons and architecture workshops—to accelerate adoption of Microsoft's cloud platforms.
- Build trusted relationships with platform leads, co-designing secure, scalable architectures and solutions.
- Resolve technical blockers and objections, collaborating with engineering to share insights and improve products.
- Maintain deep expertise in the Analytics Portfolio: Microsoft Fabric (OneLake, DW, real-time intelligence, BI, Copilot), Azure Databricks, Purview Data Governance, and Azure Databases: SQL DB, Cosmos DB, PostgreSQL.
- Maintain and grow expertise in on-prem EDW (Teradata, Netezza, Exadata), Hadoop and BI solutions.
- Represent Microsoft through thought leadership in cloud Database & Analytics communities and customer forums.

Qualifications
- 6+ years technical pre-sales, technical consulting, or technology delivery, or related experience; OR equivalent experience
- 4+ years experience with cloud and hybrid, or on-premises infrastructure, architecture designs, migrations, industry standards, and/or technology management
- Proficient in data warehouse & big data migration, including on-prem appliances (Teradata, Netezza, Oracle), Hadoop (Cloudera, Hortonworks) and Azure Synapse Gen2
- OR 5+ years technical pre-sales or technical consulting experience; OR Bachelor's Degree in Computer Science, Information Technology, or related field AND 4+ years technical pre-sales or technical consulting experience; OR Master's Degree in Computer Science, Information Technology, or related field AND 3+ years technical pre-sales or technical consulting experience; OR equivalent experience
- Expert on Azure Databases (SQL DB, Cosmos DB, PostgreSQL), from migration & modernization to creating new AI apps
- Expert on Azure Analytics (Fabric, Azure Databricks, Purview) and competitors (BigQuery, Redshift, Snowflake) in data warehouse, data lake, big data, analytics, real-time intelligence, and reporting using integrated Data Security & Governance
- Proven ability to lead technical engagements (e.g., hackathons, PoCs, MVPs) that drive production-scale outcomes

Microsoft is an equal opportunity employer. Consistent with applicable law, all qualified applicants will receive consideration for employment without regard to age, ancestry, citizenship, color, family or medical care leave, gender identity or expression, genetic information, immigration status, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran or military status, race, ethnicity, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable local laws, regulations and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application process, read more about requesting accommodations.
Posted 1 week ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
Remote
When you join Verizon You want more out of a career. A place to share your ideas freely — even if they’re daring or different. Where the true you can learn, grow, and thrive. At Verizon, we power and empower how people live, work and play by connecting them to what brings them joy. We do what we love — driving innovation, creativity, and impact in the world. Our V Team is a community of people who anticipate, lead, and believe that listening is where learning begins. In crisis and in celebration, we come together — lifting our communities and building trust in how we show up, everywhere & always. Want in? Join the #VTeamLife. Responsibilities What you’ll be doing... Publishing various insights & inferences for technical and senior leadership to make informed decisions. Collecting, processing, and performing statistical analysis on large datasets to discover useful information, suggest conclusions, and support decision-making Identifying, defining, and scoping moderately complex data analytics problems in the Enterprise Cyber Security domain. Developing cross-domain strategies for increased network security and resiliency of critical infrastructure, working with researchers in other disciplines Designing, developing and maintaining applications and databases by evaluating business needs, analyzing requirements and developing software systems. Researching, developing, designing and implementing machine learning algorithms for cyber threat detection in Enterprise Security and IAM functions and transform data points into objective Executing full software development life cycle (SDLC) – concept, design, build, deploy, test, release and support. Managing daily activities include but are not limited to attending project calls to groom new user stories, acting as a liaison between business and technical teams, collecting, organizing, and interpreting data using statistical tools,developing user interface components using programming languages, and visualization techniques. All aspects of a project from analysis, testing, implementation and support after launch. What We’re Looking For... Experience with SQL Server/Teradata/DB2 databases. Experience with advanced analytics using R or Python in performing data analysis. Fundamental knowledge in and/or experience applying algorithms in one or more of the following Machine Learning areas: anomaly detection, one/few-shot learning, deep learning, unsupervised feature learning, ensemble methods, probabilistic graphical models, and/or reinforcement learning. Experience with visualization software like Tableau, Qlik, Looker or Thoughtspot to tell data-driven stories to business users at all levels Broad knowledge of IT Security such as end point, network and cloud Security Developing user interface components and implementing them following well-known React.js workflows (such as Flux or Redux). You will ensure that these components and the overall application are robust and easy to maintain. You will coordinate with the rest of the team working on different layers of the infrastructure. Your duties will include designing software solutions to meet project requirements, maintaining and refactoring existing code, writing tests, and fixing bugs. Ability to communicate comprehensive knowledge effectively across multi-disciplinary teams and to non-cyber experts, as well as demonstrate the proficient interpersonal skills necessary to effectively collaborate in a team environment. 
Following appropriate systems life cycle methodologies, such as Agile and Waterfall, to ensure quality and maintainability, and communicating status to IT management. Staying abreast of changes and advances in data warehousing technology. Performing the role of detective: digging deep into the data warehouse to check whether new data requirements are already available for the business to access and, if not, determining how the new data will fit in and be ingested and exposed in a usable manner.

You’ll need to have:
A Bachelor's degree with two or more years of work experience. Two or more years of professional experience in data analytics, business analysis or a comparable analytics position. Ability to write SQL against a relational database in order to analyze and test data. Two or more years of professional experience in the IT Security domain. Familiarity with RESTful APIs. Experience with popular React.js workflows (such as Flux or Redux). Exposure to Threat, Risk and Vulnerability Management is an added advantage. Familiarity with application development.

Even better if you have one or more of the following:
A Bachelor's degree in Computer Science/Information Systems or an equivalent combination of education and work experience. Strong verbal and written communication skills. Ability to work in a team environment. Familiarity with modern front-end build pipelines and tools. Knowledge of modern authorization mechanisms, such as JSON Web Tokens.

When you join Verizon
You’ll be doing work that matters alongside other talented people, transforming the way people, businesses and things connect with each other. Beyond powering America’s fastest and most reliable network, we’re leading the way in broadband, cloud and security solutions, Internet of Things and innovating in areas such as video entertainment. Of course, we will offer you great pay and benefits, but we’re about more than that. Verizon is a place where you can craft your own path to greatness. Whether you think in code, words, pictures or numbers, find your future at Verizon.

Where you’ll be working
In this hybrid role, you'll have a defined work location that includes work from home and assigned office days set by your manager.

Scheduled Weekly Hours
40

Equal Employment Opportunity
Verizon is an equal opportunity employer. We evaluate qualified applicants without regard to race, gender, disability or any other legally protected characteristics.
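A minimal sketch of the SQL analyze-and-test skill named above, assuming an invented auth_events table; Python's built-in sqlite3 stands in here for the SQL Server/Teradata/DB2 warehouse the posting actually targets:

import sqlite3

# Hypothetical security-events table; in production this would be
# SQL Server/Teradata/DB2 reached through the appropriate driver.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE auth_events (
        event_id INTEGER PRIMARY KEY,
        user_id  TEXT NOT NULL,
        outcome  TEXT,            -- 'success' or 'failure'
        ts       TEXT
    )
""")
conn.executemany(
    "INSERT INTO auth_events (user_id, outcome, ts) VALUES (?, ?, ?)",
    [("u1", "failure", "2025-06-01T10:00"),
     ("u1", "failure", "2025-06-01T10:01"),
     ("u2", "success", "2025-06-01T10:02")],
)

# Data-quality test: no NULL outcomes should slip into the warehouse.
null_count = conn.execute(
    "SELECT COUNT(*) FROM auth_events WHERE outcome IS NULL"
).fetchone()[0]
assert null_count == 0, f"{null_count} rows with NULL outcome"

# Analysis: users with repeated failed logins.
for user_id, failures in conn.execute("""
    SELECT user_id, COUNT(*) AS failures
    FROM auth_events
    WHERE outcome = 'failure'
    GROUP BY user_id
    HAVING COUNT(*) >= 2
"""):
    print(user_id, failures)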
Posted 1 week ago
0 years
0 Lacs
Hyderabad, Telangana, India
Remote
When you join Verizon
You want more out of a career. A place to share your ideas freely — even if they’re daring or different. Where the true you can learn, grow, and thrive. At Verizon, we power and empower how people live, work and play by connecting them to what brings them joy. We do what we love — driving innovation, creativity, and impact in the world. Our V Team is a community of people who anticipate, lead, and believe that listening is where learning begins. In crisis and in celebration, we come together — lifting our communities and building trust in how we show up, everywhere & always. Want in? Join the #VTeamLife.

Responsibilities
What you’ll be doing...
Publishing various insights & inferences for technical and senior leadership to make informed decisions. Collecting, processing, and performing statistical analysis on large datasets to discover useful information, suggest conclusions, and support decision-making. Identifying, defining, and scoping moderately complex data analytics problems in the Enterprise Cyber Security domain. Developing cross-domain strategies for increased network security and resiliency of critical infrastructure, working with researchers in other disciplines. Designing, developing and maintaining applications and databases by evaluating business needs, analyzing requirements and developing software systems. Researching, developing, designing and implementing machine learning algorithms for cyber threat detection in Enterprise Security and IAM functions, and transforming data points into objective insights. Executing the full software development life cycle (SDLC) – concept, design, build, deploy, test, release and support. Managing daily activities, including but not limited to: attending project calls to groom new user stories, acting as a liaison between business and technical teams, collecting, organizing, and interpreting data using statistical tools and visualization techniques, and developing user interface components using programming languages. Owning all aspects of a project, from analysis and testing through implementation and post-launch support.

What We’re Looking For...
Experience with SQL Server/Teradata/DB2 databases. Experience with advanced analytics using R or Python in performing data analysis. Fundamental knowledge of and/or experience applying algorithms in one or more of the following machine learning areas: anomaly detection, one/few-shot learning, deep learning, unsupervised feature learning, ensemble methods, probabilistic graphical models, and/or reinforcement learning. Experience with visualization software like Tableau, Qlik, Looker or ThoughtSpot to tell data-driven stories to business users at all levels. Broad knowledge of IT security, such as endpoint, network and cloud security. Developing user interface components and implementing them following well-known React.js workflows (such as Flux or Redux). You will ensure that these components and the overall application are robust and easy to maintain. You will coordinate with the rest of the team working on different layers of the infrastructure. Your duties will include designing software solutions to meet project requirements, maintaining and refactoring existing code, writing tests, and fixing bugs. Ability to communicate comprehensive knowledge effectively across multi-disciplinary teams and to non-cyber experts, as well as demonstrate the proficient interpersonal skills necessary to effectively collaborate in a team environment.
Following appropriate systems life cycle methodologies, such as Agile and Waterfall, to ensure quality and maintainability, and communicating status to IT management. Staying abreast of changes and advances in data warehousing technology. Performing the role of detective: digging deep into the data warehouse to check whether new data requirements are already available for the business to access and, if not, determining how the new data will fit in and be ingested and exposed in a usable manner.

You’ll need to have:
A Bachelor's degree with two or more years of work experience. Two or more years of professional experience in data analytics, business analysis or a comparable analytics position. Ability to write SQL against a relational database in order to analyze and test data. Two or more years of professional experience in the IT Security domain. Familiarity with RESTful APIs. Experience with popular React.js workflows (such as Flux or Redux). Exposure to Threat, Risk and Vulnerability Management is an added advantage. Familiarity with application development.

Even better if you have one or more of the following:
A Bachelor's degree in Computer Science/Information Systems or an equivalent combination of education and work experience. Strong verbal and written communication skills. Ability to work in a team environment. Familiarity with modern front-end build pipelines and tools. Knowledge of modern authorization mechanisms, such as JSON Web Tokens.

When you join Verizon
You’ll be doing work that matters alongside other talented people, transforming the way people, businesses and things connect with each other. Beyond powering America’s fastest and most reliable network, we’re leading the way in broadband, cloud and security solutions, Internet of Things and innovating in areas such as video entertainment. Of course, we will offer you great pay and benefits, but we’re about more than that. Verizon is a place where you can craft your own path to greatness. Whether you think in code, words, pictures or numbers, find your future at Verizon.

Where you’ll be working
In this hybrid role, you'll have a defined work location that includes work from home and assigned office days set by your manager.

Scheduled Weekly Hours
40

Equal Employment Opportunity
Verizon is an equal opportunity employer. We evaluate qualified applicants without regard to race, gender, disability or any other legally protected characteristics.
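As a hedged illustration of the anomaly-detection area listed above — synthetic data, invented feature names, and an untuned contamination rate, assuming scikit-learn is available:

import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(seed=0)

# Synthetic login telemetry: [requests_per_minute, distinct_ips].
normal = rng.normal(loc=[20, 2], scale=[5, 1], size=(500, 2))
spikes = rng.normal(loc=[200, 40], scale=[20, 5], size=(5, 2))  # injected outliers
X = np.vstack([normal, spikes])

# Unsupervised anomaly detection, one of the ML areas the posting names;
# the contamination rate here is a guess, not a tuned value.
model = IsolationForest(contamination=0.01, random_state=0)
labels = model.fit_predict(X)  # -1 = anomaly, 1 = normal

print(f"flagged {np.sum(labels == -1)} of {len(X)} events as anomalous")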
Posted 1 week ago
6.0 - 10.0 years
8 - 15 Lacs
Hyderabad
Work from Office
ABOUT THE ROLE
Role Description: We are seeking a Reference Data Management Senior Analyst who, as a member of the Reference Data Product team within the Enterprise Data Management organization, will be responsible for managing and promoting the use of reference data, partnering with business Subject Matter Experts on the creation of vocabularies/taxonomies and ontologies, and developing analytic solutions using semantic technologies.

Roles & Responsibilities:
Work with the Reference Data Product Owner, external resources and other engineers as part of the product team. Develop and maintain semantically appropriate concepts. Identify and address conceptual gaps in both content and taxonomy. Maintain ontology source vocabularies for new or edited codes. Support product teams to help them leverage taxonomic solutions. Analyze data from public/internal datasets. Develop a data model/schema for the taxonomy. Create a taxonomy in the Semaphore Ontology Editor. Bulk-import data templates into Semaphore to add/update terms in taxonomies. Prepare SPARQL queries to generate ad hoc reports. Perform gap analysis on current and updated data. Maintain taxonomies in Semaphore through the Change Management process. Develop and optimize automated data ingestion pipelines through Python/PySpark when APIs are available. Collaborate with cross-functional teams to understand data requirements and design solutions that meet business needs. Identify and resolve complex data-related challenges. Participate in sprint planning meetings and provide estimations on technical implementation.

Basic Qualifications and Experience:
Master's degree with 6 years of experience in Business, Engineering, IT or a related field; OR Bachelor's degree with 8 years of experience in Business, Engineering, IT or a related field; OR Diploma with 9+ years of experience in Business, Engineering, IT or a related field.

Functional Skills:
Must-Have Skills: Knowledge of controlled vocabularies, classification, ontology and taxonomy. Experience in ontology development using Semaphore or a similar tool. Hands-on experience writing SPARQL queries on graph data. Excellent problem-solving skills and the ability to work with large, complex datasets. Understanding of data modeling, data warehousing, and data integration concepts.

Good-to-Have Skills: Hands-on experience writing SQL using any RDBMS (Redshift, Postgres, MySQL, Teradata, Oracle, etc.). Experience using cloud services such as AWS, Azure or GCP. Experience working in a product teams environment. Knowledge of Python/R, Databricks, and cloud data platforms. Knowledge of NLP (Natural Language Processing) and AI (Artificial Intelligence) for extracting and standardizing controlled vocabularies. Strong understanding of data governance frameworks, tools, and best practices.

Professional Certifications:
Databricks certificate preferred. SAFe Practitioner certificate preferred. Any data analysis certification (SQL, Python). Any cloud certification (AWS or Azure).

Soft Skills:
Strong analytical abilities to assess and improve master data processes and solutions. Excellent verbal and written communication skills, with the ability to convey complex data concepts clearly to technical and non-technical stakeholders. Effective problem-solving skills to address data-related issues and implement scalable solutions.
Ability to work effectively with global, virtual teams. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
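A minimal sketch of the "SPARQL queries for ad hoc reports" responsibility above, assuming a hypothetical SPARQL endpoint and taxonomy URIs (Semaphore's actual endpoint and authentication vary by installation), using the SPARQLWrapper package:

from SPARQLWrapper import SPARQLWrapper, JSON

# Hypothetical endpoint; a Semaphore deployment exposes its own SPARQL
# service, whose URL and auth depend on the installation.
endpoint = SPARQLWrapper("https://example.org/taxonomy/sparql")
endpoint.setReturnFormat(JSON)

# Ad hoc report: preferred labels of the narrower concepts under a term.
endpoint.setQuery("""
    PREFIX skos: <http://www.w3.org/2004/02/skos/core#>
    SELECT ?concept ?label WHERE {
        ?concept skos:broader <https://example.org/taxonomy/Biologics> ;
                 skos:prefLabel ?label .
    }
    ORDER BY ?label
""")

for row in endpoint.query().convert()["results"]["bindings"]:
    print(row["concept"]["value"], "-", row["label"]["value"])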
Posted 1 week ago
0 years
4 - 9 Lacs
Pune
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

About EY GDS
Global Delivery Services (GDS) is a dynamic and truly global delivery network. Across our six locations, we work with teams from all EY service lines, geographies and sectors, and play a vital role in the delivery of the EY growth strategy. We operate from six countries and sixteen cities: Argentina (Buenos Aires), China (Dalian), India (Bangalore, Chennai, Gurgaon, Hyderabad, Kochi, Kolkata, Mumbai, Noida, Trivandrum), Philippines (Manila), Poland (Warsaw and Wroclaw), UK (Manchester, Liverpool).

Careers in EY Global Delivery Services
Join a team of over 50,000 people, working across borders, to provide innovative and strategic business solutions to EY member firms around the world.

Join one of our dynamic teams
From accountants to coders, we offer a wide variety of fulfilling career opportunities that span all business disciplines. Our Consulting practice provides differentiated focus on the key business themes to help our clients solve better questions around technology. Our vision is to be recognized as a leading provider of differentiated technology consulting services, harnessing new disruptive technology and alliances and attracting talented people to solve our clients' issues. It's an exciting time to join us and grow your career as a technology professional. A technology career is about far more than leading-edge innovations. It’s about the application of these technologies in the real world to make a real, meaningful impact. We are looking for highly motivated, articulate individuals who have the skills to support the full technology lifecycle and are passionate about designing innovative solutions to solve complex business problems.

Your career in Consulting can span across these technology areas/service lines:

Digital Technologies: We are a globally integrated digital architecture and engineering team. Our mission is to deliver tailored, custom-built end-to-end solutions to our customers that are Digital, Cloud Native and Open Source. Our skills include Experience design, UI development, Design Thinking, Architecture & Design, Full stack development (.Net/Java/SharePoint/Power Platform), and emerging technologies like Blockchain, IoT, AR/VR, Drones, Cloud and DevSecOps. We use industrialized techniques, built on top of agile methods, utilizing our global teams to deliver end-to-end solutions at the best unit cost proposition.

Testing Services: We are the yardstick of quality software products. We break something to make the product stronger and more successful. We provide the entire gamut of testing services, including Business/User acceptance testing. Hence this is a team with all-round skills such as functional, technical and process.

Data & Analytics: Data and Analytics is amongst the largest and most versatile practices within EY. Our sector and domain expertise combined with technical skills in data, cloud, advanced analytics and artificial intelligence differentiates us in the industry.
Our talented team possesses cross-sector and cross-domain expertise and a wide array of skills in Information Management (IM), Business Intelligence (BI), Advanced Analytics (AA) and Artificial Intelligence (AI).

Oracle: We provide a one-stop solution for end-to-end project implementation enabled by Oracle and IBM products. We use proven methodologies, tools and accelerators to jumpstart and support large Risk and Finance Transformation programs. We develop solutions using various languages such as SQL or PL/SQL, Java, JavaScript, Python, IBM Maximo and other Oracle utilities. We also provide consulting services for streamlining the current reporting process using various Enterprise Performance Management tools.

SAP: By building on SAP’s S/4HANA digital core and cloud services, EY and SAP are working to help organizations leverage industry-leading technologies to improve operational performance. This collaboration helps drive digital transformation for our clients across areas including finance, human resources, supply chain and procurement. Our goal is to support clients as they initiate or undergo major transformation. Our capabilities span end-to-end solution implementation services, from strategy and architecture to production deployment. EY supports clients in three main areas: Technology implementation support; Enterprise and Industry application implementation; and Governance Risk Compliance (GRC) Technology.

Banking and Capital Market Services: Banking and Capital Market Services companies are transforming their complex tax and finance functions with technologies such as AI and ML. With the right blend of core competencies, tax and finance personnel will shift to data, process and technology skills to service global clients on their Core Banking Platforms and support their business/digital transformation, such as deposit system replacements, lending/leasing modernization, and cloud-native architecture (containerization).

Wealth and Asset Management: We help our clients thrive in a transformative age by providing innovative services to global and domestic asset management clients to increase efficiency and effectiveness and manage the overall impact on bottom-line profitability by leveraging technology, data and digital teams. We run many operational efficiency programs and Technology Enabled Transformations to re-platform their front and back offices with emerging technologies like AI, ML and Blockchain.

Insurance Transformation: Changing macroeconomic trends continue to challenge insurers globally. However, with disruptive technologies – including IoT, autonomous vehicles and Blockchain – we help companies through these challenges and create innovative strategies to transform their business through technology-enabled transformation programs. We provide end-to-end services to global P&C (General), Life and Health insurers, reinsurers and insurance brokers.

Cyber Security: The ever-increasing risk and complexity surrounding cybersecurity and privacy has put cybersecurity at the top of the agenda for senior management, the Board of Directors, and regulators. We help our clients to understand and quantify their cyber risk, prioritize investments, and embed security, privacy and resilience into every digitally enabled initiative – from day one.

Technology Risk: A unique, industry-focused business unit that provides a broad range of integrated services, where you’ll contribute technically to IT Risk and Assurance client engagements and internal projects.
An important part of your role will be to actively establish, maintain and strengthen internal and external relationships. You’ll also identify potential business opportunities for EY within existing engagements and escalate these as appropriate. Similarly, you’ll anticipate and identify risks within engagements and share any issues with senior members of the team.

Behavioral Competencies:
Adaptive to the team and fosters a collaborative approach. Brings an innovative approach to the project when required. Shows passion and curiosity, a desire to learn, and the ability to think digital. Has an agile mindset and the ability to multi-task. Must have an eye for detail.

Skills needed:
Should have an understanding and/or experience of software development best practices and the software development life cycle. Understanding of one or more programming languages such as Java/.Net/Python, data analytics, or databases such as SQL/Oracle/Teradata. An internship in a relevant technology domain will be an added advantage.

Qualification:
BE - B.Tech (IT/Computer Science/Circuit branches). Should have secured 60% and above. No active backlogs.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 1 week ago
0 years
0 Lacs
Chennai
Remote
Chennai, India | Hyderabad, India | Bangalore, India
Job ID: R-1077091
Apply prior to the end date: June 28th, 2025

When you join Verizon
You want more out of a career. A place to share your ideas freely — even if they’re daring or different. Where the true you can learn, grow, and thrive. At Verizon, we power and empower how people live, work and play by connecting them to what brings them joy. We do what we love — driving innovation, creativity, and impact in the world. Our V Team is a community of people who anticipate, lead, and believe that listening is where learning begins. In crisis and in celebration, we come together — lifting our communities and building trust in how we show up, everywhere & always. Want in? Join the #VTeamLife.

What you’ll be doing...
Responsibilities: Publishing various insights & inferences for technical and senior leadership to make informed decisions. Collecting, processing, and performing statistical analysis on large datasets to discover useful information, suggest conclusions, and support decision-making. Identifying, defining, and scoping moderately complex data analytics problems in the Enterprise Cyber Security domain. Developing cross-domain strategies for increased network security and resiliency of critical infrastructure, working with researchers in other disciplines. Designing, developing and maintaining applications and databases by evaluating business needs, analyzing requirements and developing software systems. Researching, developing, designing and implementing machine learning algorithms for cyber threat detection in Enterprise Security and IAM functions, and transforming data points into objective insights. Executing the full software development life cycle (SDLC) – concept, design, build, deploy, test, release and support. Managing daily activities, including but not limited to: attending project calls to groom new user stories, acting as a liaison between business and technical teams, collecting, organizing, and interpreting data using statistical tools and visualization techniques, and developing user interface components using programming languages. Owning all aspects of a project, from analysis and testing through implementation and post-launch support.

What we’re looking for...
Experience with SQL Server/Teradata/DB2 databases. Experience with advanced analytics using R or Python in performing data analysis. Fundamental knowledge of and/or experience applying algorithms in one or more of the following machine learning areas: anomaly detection, one/few-shot learning, deep learning, unsupervised feature learning, ensemble methods, probabilistic graphical models, and/or reinforcement learning. Experience with visualization software like Tableau, Qlik, Looker or ThoughtSpot to tell data-driven stories to business users at all levels. Broad knowledge of IT security, such as endpoint, network and cloud security. Developing user interface components and implementing them following well-known React.js workflows (such as Flux or Redux). You will ensure that these components and the overall application are robust and easy to maintain. You will coordinate with the rest of the team working on different layers of the infrastructure. Your duties will include designing software solutions to meet project requirements, maintaining and refactoring existing code, writing tests, and fixing bugs.
Ability to communicate comprehensive knowledge effectively across multi-disciplinary teams and to non-cyber experts, as well as demonstrate the proficient interpersonal skills necessary to effectively collaborate in a team environment. Following appropriate systems life cycle methodologies, such as Agile and Waterfall, to ensure quality and maintainability, and communicating status to IT management. Staying abreast of changes and advances in data warehousing technology. Performing the role of detective: digging deep into the data warehouse to check whether new data requirements are already available for the business to access and, if not, determining how the new data will fit in and be ingested and exposed in a usable manner.

You’ll need to have:
A Bachelor's degree with two or more years of work experience. Two or more years of professional experience in data analytics, business analysis or a comparable analytics position. Ability to write SQL against a relational database in order to analyze and test data. Two or more years of professional experience in the IT Security domain. Familiarity with RESTful APIs. Experience with popular React.js workflows (such as Flux or Redux). Exposure to Threat, Risk and Vulnerability Management is an added advantage. Familiarity with application development.

Even better if you have one or more of the following:
A Bachelor's degree in Computer Science/Information Systems or an equivalent combination of education and work experience. Strong verbal and written communication skills. Ability to work in a team environment. Familiarity with modern front-end build pipelines and tools. Knowledge of modern authorization mechanisms, such as JSON Web Tokens.

When you join Verizon
You’ll be doing work that matters alongside other talented people, transforming the way people, businesses and things connect with each other. Beyond powering America’s fastest and most reliable network, we’re leading the way in broadband, cloud and security solutions, Internet of Things and innovating in areas such as video entertainment. Of course, we will offer you great pay and benefits, but we’re about more than that. Verizon is a place where you can craft your own path to greatness. Whether you think in code, words, pictures or numbers, find your future at Verizon.

Where you’ll be working
In this hybrid role, you'll have a defined work location that includes work from home and assigned office days set by your manager.

Scheduled Weekly Hours
40

Equal Employment Opportunity
Verizon is an equal opportunity employer. We evaluate qualified applicants without regard to race, gender, disability or any other legally protected characteristics.
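For the JSON Web Token item above, a minimal sketch using the PyJWT package; the secret and claims are invented, and production keys would come from a secrets store rather than source code:

import jwt  # PyJWT

SECRET = "demo-secret"  # illustrative only; real keys come from a vault

# Issue a token carrying hypothetical authorization claims.
token = jwt.encode(
    {"sub": "user-42", "role": "analyst"},
    SECRET,
    algorithm="HS256",
)

# Verify and decode on the receiving side; tampering or a wrong
# key raises jwt.InvalidTokenError.
claims = jwt.decode(token, SECRET, algorithms=["HS256"])
print(claims["sub"], claims["role"])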
Posted 1 week ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Overview
C5i is a pure-play AI & Analytics provider that combines the power of human perspective with AI technology to deliver trustworthy intelligence. The company drives value through a comprehensive solution set, integrating multifunctional teams that have technical and business domain expertise with a robust suite of products, solutions, and accelerators tailored for various horizontal and industry-specific use cases. At the core, C5i’s focus is to deliver business impact at speed and scale by driving adoption of AI-assisted decision-making. C5i caters to some of the world’s largest enterprises, including many Fortune 500 companies. The company’s clients span Technology, Media, and Telecom (TMT), Pharma & Lifesciences, CPG, Retail, Banking, and other sectors. C5i has been recognized by leading industry analysts like Gartner and Forrester for its Analytics and AI capabilities and proprietary AI-based platforms.

Global offices: United States | Canada | United Kingdom | United Arab Emirates | India

Job Summary
We are looking for experienced Data Modelers to support large-scale data engineering and analytics initiatives. The role involves developing logical and physical data models, working closely with business and engineering teams to define data requirements, and ensuring alignment with enterprise standards.

• Independently complete conceptual, logical and physical data models for any supported platform, including SQL Data Warehouse, Spark, Databricks Delta Lakehouse or other cloud data warehousing technologies.
• Govern data design/modelling – documentation of metadata (business definitions of entities and attributes) and construction of database objects, for baseline and investment-funded projects, as assigned.
• Develop a deep understanding of business domains like Customer, Sales, Finance, and Supplier, and of the enterprise technology inventory, to craft a solution roadmap that achieves business objectives and maximizes reuse.
• Drive collaborative reviews of data model design, code, data, and security features to drive data product development.
• Show expertise for data at all levels: low-latency, relational, and unstructured data stores; analytical and data lakes; SAP Data Model.
• Develop reusable data models based on cloud-centric, code-first approaches to data management and data mapping.
• Partner with the data stewards team for data discovery and action by business customers and stakeholders.
• Provide and/or support data analysis, requirements gathering, solution development, and design reviews for enhancements to, or new, applications/reporting.
• Assist with data planning, sourcing, collection, profiling, and transformation.
• Support data lineage and mapping of source system data to canonical data stores.
• Create Source-to-Target Mappings (STTM) for ETL and BI developers.

Skills needed:
• Expertise in data modelling tools (ER/Studio, Erwin; IDM/ARDM models; CPG/Manufacturing/Sales/Finance/Supplier/Customer domains).
• Experience with at least one MPP database technology such as Databricks Lakehouse, Redshift, Synapse, Teradata, or Snowflake.
• Experience with version control systems like GitHub and deployment & CI tools.
• Experience of metadata management, data lineage, and data glossaries is a plus.
• Working knowledge of agile development, including DevOps and DataOps concepts.
• Working knowledge of SAP data models, particularly in the context of HANA and S/4HANA, and of retail data like IRI and Nielsen Retail.

C5i is proud to be an equal opportunity employer.
We are committed to equal employment opportunity regardless of race, color, religion, sex, sexual orientation, age, marital status, disability, gender identity, etc. If you have a disability or special need that requires accommodation, please keep us informed at the hiring stages so that we can factor in the necessary accommodations.
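A small, invented example of the Source-to-Target Mapping (STTM) deliverable mentioned above, expressed as plain Python data so lineage questions can be answered programmatically; all system, field, and table names are hypothetical:

# A minimal STTM of the kind handed to ETL/BI developers; rules are
# described informally here, not executed.
STTM = [
    # (source system, source field, target table, target column, rule)
    ("SAP", "KUNNR",      "dim_customer", "customer_id",   "trim + cast to STRING"),
    ("SAP", "NAME1",      "dim_customer", "customer_name", "title-case"),
    ("CRM", "created_at", "dim_customer", "created_date",  "parse ISO-8601 to DATE"),
]

def lineage_for(target_column: str) -> list[str]:
    """Trace a target column back to its source fields."""
    return [f"{src}.{field}" for src, field, _, col, _ in STTM if col == target_column]

print(lineage_for("customer_id"))  # ['SAP.KUNNR']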
Posted 1 week ago
5.0 - 8.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job description: Job Description

Role Purpose
The purpose of the role is to support process delivery by ensuring daily performance of the Production Specialists, resolving technical escalations and developing technical capability within the Production Specialists.

Do
Oversee and support the process by reviewing daily transactions on performance parameters. Review the performance dashboard and the scores for the team. Support the team in improving performance parameters by providing technical support and process guidance. Record, track, and document all queries received, problem-solving steps taken, and total successful and unsuccessful resolutions. Ensure standard processes and procedures are followed to resolve all client queries. Resolve client queries as per the SLAs defined in the contract. Develop an understanding of the process/product for the team members to facilitate better client interaction and troubleshooting. Document and analyze call logs to spot the most frequently occurring trends to prevent future problems. Identify red flags and escalate serious client issues to the Team Leader in cases of untimely resolution. Ensure all product information and disclosures are given to clients before and after the call/email requests. Avoid legal challenges by monitoring compliance with service agreements.

Handle technical escalations through effective diagnosis and troubleshooting of client queries. Manage and resolve technical roadblocks/escalations as per SLA and quality requirements. If unable to resolve the issues, escalate them to TA & SES in a timely manner. Provide product support and resolution to clients by performing a question diagnosis while guiding users through step-by-step solutions. Troubleshoot all client queries in a user-friendly, courteous and professional manner. Offer alternative solutions to clients (where appropriate) with the objective of retaining customers’ and clients’ business. Organize ideas and effectively communicate oral messages appropriate to listeners and situations. Follow up and make scheduled call-backs to customers to record feedback and ensure compliance with contract SLAs.

Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client. Mentor and guide Production Specialists on improving technical knowledge. Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists. Develop and conduct trainings (triages) within products for Production Specialists as per target. Inform the client about the triages being conducted. Undertake product trainings to stay current with product features, changes and updates. Enroll in product-specific and any other trainings per client requirements/recommendations. Identify and document the most common problems and recommend appropriate resolutions to the team. Update job knowledge by participating in self-learning opportunities and maintaining personal networks.

Deliver
No. | Performance Parameter | Measure
1 | Process | No. of cases resolved per day, compliance to process and quality standards, meeting process-level SLAs, Pulse score, customer feedback, NSAT/ESAT
2 | Team Management | Productivity, efficiency, absenteeism
3 | Capability Development | Triages completed, Technical Test performance

Mandatory Skills: Teradata. Experience: 5-8 years.

Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention. Of yourself, your career, and your skills.
We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.
Posted 1 week ago
5.0 - 8.0 years
25 - 30 Lacs
Indore, Chennai
Work from Office
We are seeking a Senior Python DevOps Engineer to develop Python services and build CI/CD pipelines for AI/data platforms. Must have strong cloud, container, and ML workflow deployment experience.

Required Candidate profile: Experienced Python DevOps engineer with expertise in CI/CD, cloud, and AI platforms. Skilled in Flask/FastAPI, Airflow, MLFlow, and model deployment on Dataiku and OpenShift.
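A rough sketch of the kind of Python service this role builds — a minimal FastAPI app with a health probe for CI/CD and a stubbed prediction endpoint; the service name and scoring logic are invented, with the real model presumably loaded from an MLFlow registry:

from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="scoring-service")  # hypothetical service name

class Features(BaseModel):
    values: list[float]

@app.get("/health")
def health() -> dict:
    # Liveness probe used by the deployment pipeline / OpenShift.
    return {"status": "ok"}

@app.post("/predict")
def predict(features: Features) -> dict:
    # Stand-in for a real model; here, just an average of the inputs.
    score = sum(features.values) / max(len(features.values), 1)
    return {"score": score}

# Run locally with: uvicorn scoring_service:app --reload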
Posted 1 week ago
4.0 - 6.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Title: GCP Teradata Engineer
Location: Chennai, Bangalore, Hyderabad
Experience: 4-6 Years

Job Summary
We are seeking a GCP Data & Cloud Engineer with strong expertise in Google Cloud Platform services, including BigQuery, Cloud Run, Cloud Storage, and Pub/Sub. The ideal candidate will have deep experience in SQL coding, data pipeline development, and deploying cloud-native solutions.

Key Responsibilities
Design, implement, and optimize scalable data pipelines and services using GCP. Build and manage cloud-native applications deployed via Cloud Run. Develop complex, performance-optimized SQL queries for analytics and data transformation. Manage and automate data storage, retrieval, and archival using Cloud Storage. Implement event-driven architectures using Google Pub/Sub. Work with large datasets in BigQuery, including ETL/ELT design and query optimization. Ensure security, monitoring, and compliance of cloud-based systems. Collaborate with data analysts, engineers, and product teams to deliver end-to-end cloud solutions.

Required Skills & Experience
4+ years of experience working with Google Cloud Platform (GCP). Strong proficiency in SQL coding, query tuning, and handling complex data transformations. Hands-on experience with BigQuery, Cloud Run, Cloud Storage, and Pub/Sub. Understanding of data pipeline and ETL/ELT workflows in cloud environments. Familiarity with containerized services and CI/CD pipelines. Experience in scripting languages (e.g., Python, Shell) is a plus. Strong analytical and problem-solving skills.
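A minimal sketch combining two of the required services — publishing an event to Pub/Sub and running a parameterized BigQuery query — using the official google-cloud client libraries; the project, topic, and table names are placeholders:

from google.cloud import bigquery, pubsub_v1

PROJECT = "my-project"   # hypothetical project and topic IDs
TOPIC = "ingest-events"

# Event-driven ingestion: publish a record for downstream pipelines.
publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path(PROJECT, TOPIC)
publisher.publish(topic_path, data=b'{"order_id": 123}').result()

# Analytics: a parameterized BigQuery query over an assumed table.
client = bigquery.Client(project=PROJECT)
job = client.query(
    """
    SELECT order_date, COUNT(*) AS orders
    FROM `my-project.sales.orders`
    WHERE order_date >= @start
    GROUP BY order_date
    ORDER BY order_date
    """,
    job_config=bigquery.QueryJobConfig(
        query_parameters=[
            bigquery.ScalarQueryParameter("start", "DATE", "2025-01-01")
        ]
    ),
)
for row in job.result():
    print(row.order_date, row.orders)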
Posted 1 week ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Overview: TekWissen is a global workforce management provider throughout India and many other countries in the world. The below client is a global company with shared ideals and a deep sense of family. From our earliest days as a pioneer of modern transportation, we have sought to make the world a better place – one that benefits lives, communities and the planet.

Job Title: Data Fullstack - Descriptive Analytics
Location: Chennai
Work Type: Onsite

Position Description:
The Analytics Service department provides system planning, engineering and operations support for enterprise Descriptive and Predictive Analytics products, as well as Big Data solutions and Analytics Data Management products. These tools are used by the Global Data Insights and Analytics (GDIA) team, data scientists, and IT service delivery partners globally to build line-of-business applications which are directly used by the end-user community. Products and platforms include Power BI, Alteryx, Informatica, Google BigQuery, and more - all of which are critical to the client's rapidly evolving needs in the area of Analytics and Big Data. In addition, business intelligence reporting products such as Business Objects, Qlik Sense and WebFOCUS are used by our core lines of business for both employees and dealers. This position is part of the Descriptive Analytics team. It is a Full Stack Engineering and Operations position, engineering and operating our strategic Power BI dashboarding and visualization platform and other products as required, such as Qlik Sense, Alteryx, Business Objects, WebFOCUS, Looker, and other new platforms as they are introduced. The person in this role will collaborate with team members to produce well-tested and documented run books, test cases, and change requests, and handle change implementations as needed. The candidate will start with primarily operational tasks until the products are well understood and will then progress to assisting with engineering tasks.

Skills Required: GCP, Tekton, GitHub, Terraform, PowerShell, OpenShift

Experience Required:
Position Qualifications: Bachelor's Degree in a relevant field. At least 5 years of experience with Descriptive Analytics technologies. DevOps experience with GitHub, Tekton pipelines, Terraform code, Google Cloud Services, and PowerShell, and managing large GCP installations; OR System Administrator experience managing large multi-tenant Windows Server environments based on GCP Compute Engine or OpenShift Virtualization VMs. Strong troubleshooting and problem-solving skills. Understanding of the Product Life Cycle. Ability to coordinate issue resolution with vendors on behalf of the client. Strong written and verbal communication skills. Understanding of technologies like Power BI, BigQuery, Teradata, SQL Server, Oracle, DB2, etc. Basic understanding of database connectivity and authentication methods (ODBC, JDBC, drivers, REST, WIF, Cloud SA or vault keys, etc.).

Experience Preferred:
Recommended: Experience with PowerApps and Power Automate. Familiarity with Jira. Familiarity with the client EAA, RTP, and EAMS processes and the client security policies (GRC).

Education Required: Bachelor's Degree

TekWissen® Group is an equal opportunity employer supporting workforce diversity.
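For the database-connectivity item above, a minimal ODBC sketch using the pyodbc package; the driver name, server, credentials, and table are placeholders, and the exact connection string depends on the platform (Teradata, SQL Server, DB2, ...):

import pyodbc

# Hypothetical DSN-less connection string; in practice the driver,
# host, and credentials come from configuration or a vault.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=warehouse.example.com;"
    "DATABASE=analytics;"
    "UID=svc_reporting;PWD=from-vault;",
    timeout=10,
)

cursor = conn.cursor()
cursor.execute("SELECT COUNT(*) FROM dbo.sales_fact")  # assumed table
print("row count:", cursor.fetchone()[0])
conn.close()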
Posted 1 week ago
2.0 - 4.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Perforce is a community of collaborative experts, problem solvers, and possibility seekers who believe work should be both challenging and fun. We are proud to inspire creativity, foster belonging, support collaboration, and encourage wellness. At Perforce, you’ll work with and learn from some of the best and brightest in business. Before you know it, you’ll be in the middle of a rewarding career at a company headed in one direction: upward.

With a global footprint spanning more than 80 countries and including over 75% of the Fortune 100, Perforce Software, Inc. is trusted by the world’s leading brands to deliver solutions for the toughest challenges. The best-run DevOps teams in the world choose Perforce.

Position Summary:
The Delphix team is seeking engineers with a passion for data security to join our data compliance engineering team. In this position, you will get the opportunity to contribute to the Hyperscale Compliance product, for which development and testing are driven completely by the India Engineering Team. This product was launched into the market in July 2022 and handles compliance use cases for large-scale datasets. Delphix Hyperscale Compliance is based on a microservices architecture and uses a cluster of Delphix Continuous Compliance engines to achieve faster results. Delphix Continuous Compliance provides a single platform to secure and deliver data across the enterprise, ensuring that sensitive information is protected and allowing data operators to centrally manage security policies and compliance requirements. You will be responsible for writing automation tests for complex features and performing manual, performance and scale tests. You will work closely with Product Management, customers, and other engineering stakeholders to design tests for the new solution. You will also collaborate with other team members to deliver highly scalable, secure and maintainable features.

Responsibilities:
Collaborate with Product Management and other stakeholders within Engineering to maintain a high bar for quality. Advocate for improvements to product quality, security, and performance. Craft code that meets our internal standards for style, maintainability, and best practices for a high-scale web environment. Maintain and advocate for these standards through code reviews. Scope, design and implement test automation for allocated features. Monitor and maintain test automation across multiple platforms and configurations.

Requirements:
2-4 years of experience testing enterprise applications deployed on-prem and/or in the cloud using Python. Proficiency in Python or a related programming language. Excellent analytical and problem-solving skills. Ability and desire to work in a fast-paced, test-driven, agile, collaborative, and iterative programming environment. Ability to think clearly and articulate your vision with the appropriate technical depth. A desire to build great products, learn new technical areas, and dive in wherever there is a need.

Desired Experience:
Proficiency in Docker and Kubernetes. Previous experience writing automation frameworks in Python. Deep understanding of file systems and operating systems. Experience with large/complex relational databases, data warehouses (Oracle, SQL Server, DB2, Azure, Amazon RDS, Teradata, etc.) and other business data formats (XML, ASC X12, etc.).

Come work with us! Our team members are valued for their contributions, introduced to new opportunities, and rewarded well.
Perforce combines the experience and rewards of a start-up with the security of an established and privately held profitable company. If you are passionate about the technology that impacts our day-to-day lives and want to work with talented and dedicated people across the globe, apply today! www.perforce.com Please click here for: EOE & Belonging Statements | Perforce Software
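A small sketch of the Python test-automation style this role calls for, using pytest; the masking function under test is a stand-in, since the product's real APIs are internal:

# test_masking.py -- run with `pytest`
import pytest

def mask_ssn(value: str) -> str:
    """Toy compliance rule: keep only the last four digits."""
    digits = [c for c in value if c.isdigit()]
    if len(digits) != 9:
        raise ValueError("expected 9-digit SSN")
    return "***-**-" + "".join(digits[-4:])

@pytest.mark.parametrize(
    "raw, expected",
    [("123-45-6789", "***-**-6789"),
     ("123456789", "***-**-6789")],
)
def test_mask_ssn_formats(raw, expected):
    # Masking must be stable across input formats.
    assert mask_ssn(raw) == expected

def test_mask_ssn_rejects_short_input():
    # Malformed identifiers must fail loudly, never pass through unmasked.
    with pytest.raises(ValueError):
        mask_ssn("12-34")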
Posted 1 week ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Overview
TekWissen is a global workforce management provider throughout India and many other countries in the world. The below client is a global company with shared ideals and a deep sense of family. From our earliest days as a pioneer of modern transportation, we have sought to make the world a better place – one that benefits lives, communities and the planet.

Job Title: Systems Engineering Practitioner
Location: Chennai
Duration: 12 Months
Work Type: Onsite

Position Description
The Analytics Service department provides system planning, engineering and operations support for enterprise Descriptive and Predictive Analytics products, as well as Big Data solutions and Analytics Data Management products. These tools are used by the Global Data Insights and Analytics (GDIA) team, data scientists, and IT service delivery partners globally to build line-of-business applications which are directly used by the end-user community. Products and platforms include Power BI, Alteryx, Informatica, Google BigQuery, and more - all of which are critical to the client's rapidly evolving needs in the area of Analytics and Big Data. In addition, business intelligence reporting products such as Business Objects, Qlik Sense and WebFOCUS are used by our core lines of business for both employees and dealers. This position is part of the Descriptive Analytics team. It is a Full Stack Engineering and Operations position, engineering and operating our strategic Power BI dashboarding and visualization platform and other products as required, such as Qlik Sense, Alteryx, Business Objects, WebFOCUS, Looker, and other new platforms as they are introduced. The person in this role will collaborate with team members to produce well-tested and documented run books, test cases, and change requests, and handle change implementations as needed. The candidate will start with primarily operational tasks until the products are well understood and will then progress to assisting with engineering tasks.

Skills Required: GCP, Tekton, GitHub, Terraform, PowerShell, OpenShift

Experience Required
Position Qualifications: Bachelor's Degree in a relevant field. At least 5 years of experience with Descriptive Analytics technologies. DevOps experience with GitHub, Tekton pipelines, Terraform code, Google Cloud Services, and PowerShell, and managing large GCP installations; OR System Administrator experience managing large multi-tenant Windows Server environments based on GCP Compute Engine or OpenShift Virtualization VMs. Strong troubleshooting and problem-solving skills. Understanding of the Product Life Cycle. Ability to coordinate issue resolution with vendors on behalf of the client. Strong written and verbal communication skills. Understanding of technologies like Power BI, BigQuery, Teradata, SQL Server, Oracle, DB2, etc. Basic understanding of database connectivity and authentication methods (ODBC, JDBC, drivers, REST, WIF, Cloud SA or vault keys, etc.).

Experience Preferred
Recommended: Experience with PowerApps and Power Automate. Familiarity with Jira. Familiarity with the client EAA, RTP, and EAMS processes and the client security policies (GRC).

Education Required
Bachelor's Degree

TekWissen® Group is an equal opportunity employer supporting workforce diversity.
Posted 1 week ago