3.0 years
5 - 40 Lacs
Gurugram, Haryana, India
On-site
Location: Bangalore, Pune, Chennai, Kolkata, Gurugram
Experience: 5-15 Years
Work Mode: Hybrid
Mandatory Skills: Snowflake / Azure Data Factory / PySpark / Databricks / Snowpipe
Good to Have: SnowPro Certification

Primary Roles and Responsibilities: Develop Modern Data Warehouse solutions using Snowflake, Databricks and ADF. Provide forward-thinking solutions in the data engineering and analytics space. Collaborate with DW/BI leads to understand new ETL pipeline development requirements. Triage issues to find gaps in existing pipelines and fix them. Work with the business to understand reporting-layer needs and develop a data model that fulfils them. Help junior team members resolve issues and technical challenges. Drive technical discussions with the client architect and team members. Orchestrate the data pipelines in the scheduler via Airflow.

Skills and Qualifications: Bachelor's and/or master's degree in computer science or equivalent experience. Must have 6+ years of total IT experience and 3+ years' experience in data warehouse/ETL projects. Expertise in Snowflake security, Snowflake SQL and designing/implementing other Snowflake objects. Hands-on experience with Snowflake utilities, SnowSQL, Snowpipe, Snowsight and Snowflake connectors. Deep understanding of Star and Snowflake dimensional modeling. Strong knowledge of data management principles. Good understanding of the Databricks Data & AI platform and Databricks Delta Lake architecture. Hands-on experience in SQL and Spark (PySpark). Experience in building ETL / data warehouse transformation processes. Experience with open-source non-relational / NoSQL data repositories (incl. MongoDB, Cassandra, Neo4j). Experience working with structured and unstructured data, including imaging and geospatial data. Experience working in a DevOps environment with tools such as Terraform, CircleCI and Git. Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning, troubleshooting and query optimization. Databricks Certified Data Engineer Associate/Professional certification (desirable). Comfortable working in a dynamic, fast-paced, innovative environment with several concurrent projects. Experience working in Agile methodology. Strong verbal and written communication skills. Strong analytical and problem-solving skills with high attention to detail.

Skills: pipelines, PySpark, PL/SQL, Snowflake, reporting, Snowpipe, Databricks, Spark, Azure Data Factory, projects, data warehouse, Unix shell scripting, SnowSQL, troubleshooting, RDBMS, SQL, data management principles, data, query optimization, Azure, NoSQL databases, CircleCI, Git, Terraform, Snowflake utilities, performance tuning, ETL, architect
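Purely as an illustration of the Airflow orchestration mentioned in the responsibilities above, here is a minimal DAG sketch in Python (Airflow 2.4+ assumed). The DAG id, schedule, task names and the placeholder ingest/transform steps are assumptions for illustration, not part of the posting.

```python
# Minimal Airflow DAG sketch (Airflow 2.4+; older versions use schedule_interval).
# All names and the pipeline steps are illustrative placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def ingest_to_snowflake(**context):
    """Placeholder: trigger Snowpipe / COPY INTO for newly landed files."""
    print("ingesting raw files into Snowflake staging tables")


def transform_in_databricks(**context):
    """Placeholder: submit a Databricks/PySpark transformation job."""
    print("running PySpark transformations")


with DAG(
    dag_id="dw_daily_load",          # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="0 2 * * *",            # nightly run at 02:00
    catchup=False,
) as dag:
    ingest = PythonOperator(task_id="ingest_to_snowflake", python_callable=ingest_to_snowflake)
    transform = PythonOperator(task_id="transform_in_databricks", python_callable=transform_in_databricks)

    ingest >> transform              # run transforms only after ingestion succeeds
```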
Posted 2 days ago
5.0 years
3 - 3 Lacs
Noida
On-site
Dear all, we are looking to hire a young, dynamic, presentable CRE-Aftersales/Service at our Noida Sector 63 location.
Position: CRE-Service
Experience: 5+ years in the premium segment
Location: Sector 63, Noida
Salary: Decent hike on last drawn
Skills: Excellent written and verbal skills, knowledge of Excel (basic and advanced), result-oriented.
Job Responsibilities: Make calls for service-due reminders. Work in Excel on the appointment sheet and pick-up & drop driver details. Email service appointment confirmations for online appointments. Handle customer queries on call and provide assistance related to vehicle pickup, service due dates, breakdowns, car towing, etc.
References would be highly appreciated; please share with your network and job seekers. Interested candidates can share their resume at hr@bmw-deutschemotoren.in
Regards, HRD
Job Type: Full-time
Pay: ₹25,000.00 - ₹30,000.00 per month
Benefits: Leave encashment
Schedule: Day shift
Language: Hindi (Preferred), English (Preferred)
Work Location: In person
Expected Start Date: 01/08/2025
Posted 2 days ago
7.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Position Summary... What you'll do... The Walmart Enterprise Business Services (EBS) team focuses on building technology for Finance, People Systems, Indirect Procurement and Associate Experience. The goal of the EBS team is to provide a customer-grade experience to all our associates, to empower them and to reduce overall SG&A (Selling, General and Administrative) costs. Drive the next-generation Walmart retail systems platform and services by conceptualizing, designing, building and deploying highly scalable and robust SOA solutions. You will use your engineering experience and technical skill to materially affect how millions of orders are processed and fulfilled every day. Design high-performance, scalable solutions that meet the needs of millions of Walmart customers and its next-generation logistics system. Build solutions that enable sophisticated analytics on terabytes of data collected from various sources. Implement cutting-edge models and algorithms that operate on massive amounts of data. Develop high-performance, scalable solutions that extract, transform, and load big data.

What you'll do: Deliver data engineering responsibilities independently. Consult on design with tech leads and drive deliveries. Work on specific modules or parts of the system; develop clear objectives for module functionality; decompose functionality into tasks and work on them independently. Must have good knowledge of best practices in data migration. Must mentor and guide the team on the data migration framework and best practices. Must be experienced in data profiling and data quality.

What you'll bring: At least 7 years of experience in SAP BODS, SAP IS and master data consulting, problem definition, and architecture/design detailing of processes. At least 7 years of experience in SAP data migration with good exposure to S/4 data migration. Able to independently handle master data and transaction data objects, including all ETL activities. Experience in loading data from legacy sources into S/4HANA or ECC systems; handling data migration and loading data using BODS from SAP ECC to S/4HANA via IDocs is preferred. Must have led at least two data migration projects. Good understanding of ETL tools and ETL tool options; hands-on experience with ETL tools such as BODS. Hands-on experience with data loads using LTMC and LSMW. Must have good knowledge to implement the data migration framework, from extraction to loading. Must have good experience in implementing data validation rules. Must have good hands-on experience in writing SQL or similar query languages. Should have good exposure to different data stores, e.g. application-level or RDBMS databases. Analytical and communication skills. Understanding of SAP implementation methodologies. Project and talent management; experience with project management. Experience and desire to work in a management consulting environment.

About Walmart Global Tech: Imagine working in an environment where one line of code can make life easier for hundreds of millions of people. That's what we do at Walmart Global Tech. We're a team of software engineers, data scientists, cybersecurity experts and service professionals within the world's leading retailer who make an epic impact and are at the forefront of the next retail disruption.
People are why we innovate, and people power our innovations. We are people-led and tech-empowered. We train our team in the skillsets of the future and bring in experts like you to help us grow. We have roles for those chasing their first opportunity as well as those looking for the opportunity that will define their career. Here, you can kickstart a great career in tech, gain new skills and experience for virtually every industry, or leverage your expertise to innovate at scale, impact millions and reimagine the future of retail.

Flexible, hybrid work: We use a hybrid way of working, with primary in-office presence coupled with an optimal mix of virtual presence. We use our campuses to collaborate and be together in person, as business needs require and for development and networking opportunities. This approach helps us make quicker decisions, remove location barriers across our global team, and be more flexible in our personal lives.

Benefits: Beyond our great compensation package, you can receive incentive awards for your performance. Other great perks include a host of best-in-class benefits: maternity and parental leave, PTO, health benefits, and much more.

Belonging: We aim to create a culture where every associate feels valued for who they are, rooted in respect for the individual. Our goal is to foster a sense of belonging, to create opportunities for all our associates, customers and suppliers, and to be a Walmart for everyone. At Walmart, our vision is "everyone included." By fostering a workplace culture where everyone is, and feels, included, everyone wins. Our associates and customers reflect the makeup of all 19 countries where we operate. By making Walmart a welcoming place where all people feel like they belong, we're able to engage associates, strengthen our business, improve our ability to serve customers, and support the communities where we operate.

Equal Opportunity Employer: Walmart, Inc. is an Equal Opportunity Employer - By Choice. We believe we are best equipped to help our associates, customers and the communities we serve live better when we really know them. That means understanding, respecting and valuing unique styles, experiences, identities, ideas and opinions, while being inclusive of all people.

Minimum Qualifications: Outlined below are the required minimum qualifications for this position. If none are listed, there are no minimum qualifications. Option 1: Bachelor's degree in computer science, information technology, engineering, information systems, cybersecurity, or a related area and 3 years' experience in software engineering or a related area at a technology, retail, or data-driven company. Option 2: 5 years' experience in software engineering or a related area at a technology, retail, or data-driven company.

Preferred Qualifications: Outlined below are the optional preferred qualifications for this position. If none are listed, there are no preferred qualifications. Certification in Security+, Network+, GISF, or GSEC. Information Technology - CISCO Certification.

Primary Location: Pardhanani Wilshire II, Cessna Business Park, Kadubeesanahalli Village, Varthur Hobli, India R-2236126
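As a loose illustration of the data profiling and validation rules this role calls for, here is a small pandas sketch. The extract file, the SAP-style column names and the specific rules are hypothetical examples, not Walmart's or SAP's actual migration framework.

```python
# Illustrative data-profiling / validation sketch in pandas; file name, columns,
# and the rules themselves are assumptions for demonstration only.
import pandas as pd


def profile(df: pd.DataFrame) -> pd.DataFrame:
    """Basic profile: data type, null percentage and distinct count per column."""
    return pd.DataFrame({
        "dtype": df.dtypes.astype(str),
        "null_pct": df.isna().mean().round(3),
        "distinct": df.nunique(),
    })


def validate_material_master(df: pd.DataFrame) -> list[str]:
    """Example pre-load validation rules for a material master extract."""
    issues = []
    if df["MATNR"].duplicated().any():                      # material number must be unique
        issues.append("duplicate MATNR values")
    if df["MEINS"].isna().any():                            # base unit of measure is mandatory
        issues.append("missing base unit of measure (MEINS)")
    if (~df["MTART"].isin({"FERT", "ROH", "HALB"})).any():  # allowed material types (illustrative)
        issues.append("unexpected material type (MTART)")
    return issues


legacy = pd.read_csv("material_master_extract.csv")         # hypothetical legacy extract
print(profile(legacy))
print(validate_material_master(legacy) or "all checks passed")
```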
Posted 2 days ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Parallel Wireless is leading the OpenRAN movement with the world's first 5G/4G/3G/2G cloud-native OpenRAN architecture that is open, standardized, and interoperable across five key domains: RAN, Edge, Core, Orchestration and Analytics.

What You Need: Developer with hands-on experience in cloud-native solutions. Good programming skills (C/C++/Python/Golang). Experience with data monitoring solutions, preferably with the Elastic stack. Knowledge of networking concepts is a plus.

What You Will Do: Set up, configure, and optimize Elasticsearch architecture for performance and scalability. Develop scripts and automation tools (Python, Bash) for efficient management and configuration of Elasticsearch. Implement and enforce security best practices for Elasticsearch clusters. Utilize Elasticsearch Query DSL and mappings to optimize search performance and data indexing. Design and create advanced visualizations and dashboards in Kibana to provide insights into data. Establish monitoring and alerting systems in the ELK stack to ensure system health and performance. Configure and optimize APM (Application Performance Monitoring) setups. Integrate logs into ELK for comprehensive log management and analysis. Design logging frameworks for applications to ensure consistent and structured logging practices.
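To make the Elasticsearch work above concrete, here is a brief sketch using the official Python client (elasticsearch-py 8.x calling conventions assumed): an explicit mapping, one indexed document and a Query DSL search. The index name, fields and sample data are illustrative assumptions, not Parallel Wireless internals.

```python
# Sketch of index setup and a Query DSL search with elasticsearch-py 8.x;
# index and field names are illustrative placeholders.
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")

# Explicit mapping so log fields are indexed with the intended types.
es.indices.create(
    index="ran-logs",
    mappings={
        "properties": {
            "timestamp": {"type": "date"},
            "cell_id": {"type": "keyword"},
            "level": {"type": "keyword"},
            "message": {"type": "text"},
        }
    },
)

es.index(index="ran-logs", document={
    "timestamp": "2024-01-01T00:00:00Z",
    "cell_id": "cell-42",
    "level": "ERROR",
    "message": "handover failure",
})

# Query DSL: errors for one cell within the last hour.
resp = es.search(
    index="ran-logs",
    query={
        "bool": {
            "filter": [
                {"term": {"level": "ERROR"}},
                {"term": {"cell_id": "cell-42"}},
                {"range": {"timestamp": {"gte": "now-1h"}}},
            ]
        }
    },
)
for hit in resp["hits"]["hits"]:
    print(hit["_source"]["message"])
```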
Posted 2 days ago
3.0 years
5 - 40 Lacs
Chennai, Tamil Nadu, India
On-site
Location: Bangalore, Pune, Chennai, Kolkata, Gurugram
Experience: 5-15 Years
Work Mode: Hybrid
Mandatory Skills: Snowflake / Azure Data Factory / PySpark / Databricks / Snowpipe
Good to Have: SnowPro Certification

Primary Roles and Responsibilities: Develop Modern Data Warehouse solutions using Snowflake, Databricks and ADF. Provide forward-thinking solutions in the data engineering and analytics space. Collaborate with DW/BI leads to understand new ETL pipeline development requirements. Triage issues to find gaps in existing pipelines and fix them. Work with the business to understand reporting-layer needs and develop a data model that fulfils them. Help junior team members resolve issues and technical challenges. Drive technical discussions with the client architect and team members. Orchestrate the data pipelines in the scheduler via Airflow.

Skills and Qualifications: Bachelor's and/or master's degree in computer science or equivalent experience. Must have 6+ years of total IT experience and 3+ years' experience in data warehouse/ETL projects. Expertise in Snowflake security, Snowflake SQL and designing/implementing other Snowflake objects. Hands-on experience with Snowflake utilities, SnowSQL, Snowpipe, Snowsight and Snowflake connectors. Deep understanding of Star and Snowflake dimensional modeling. Strong knowledge of data management principles. Good understanding of the Databricks Data & AI platform and Databricks Delta Lake architecture. Hands-on experience in SQL and Spark (PySpark). Experience in building ETL / data warehouse transformation processes. Experience with open-source non-relational / NoSQL data repositories (incl. MongoDB, Cassandra, Neo4j). Experience working with structured and unstructured data, including imaging and geospatial data. Experience working in a DevOps environment with tools such as Terraform, CircleCI and Git. Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning, troubleshooting and query optimization. Databricks Certified Data Engineer Associate/Professional certification (desirable). Comfortable working in a dynamic, fast-paced, innovative environment with several concurrent projects. Experience working in Agile methodology. Strong verbal and written communication skills. Strong analytical and problem-solving skills with high attention to detail.

Skills: pipelines, PySpark, PL/SQL, Snowflake, reporting, Snowpipe, Databricks, Spark, Azure Data Factory, projects, data warehouse, Unix shell scripting, SnowSQL, troubleshooting, RDBMS, SQL, data management principles, data, query optimization, Azure, NoSQL databases, CircleCI, Git, Terraform, Snowflake utilities, performance tuning, ETL, architect
Posted 2 days ago
2.0 years
5 - 40 Lacs
Chennai, Tamil Nadu, India
On-site
Location: Bangalore, Pune, Chennai, Kolkata, Gurugram
Experience: 5-15 Years
Work Mode: Hybrid
Mandatory Skills: Snowflake, SQL queries, Snowpipe, SnowSQL, dbt, Architect

Overview: Ensure new features and subject areas are modelled to integrate with existing structures and provide a consistent view. Develop and maintain documentation of the data architecture, data flow and data models of the data warehouse appropriate for various audiences. Provide direction on adoption of cloud technologies (Snowflake) and industry best practices in the field of data warehouse architecture and modelling. Provide technical leadership to large enterprise-scale projects. You will also be responsible for preparing estimates and defining technical solutions for proposals (RFPs). This role requires a broad range of skills and the ability to step into different roles depending on the size and scope of the project.

Roles & Responsibilities: Must have 5+ years total in IT, 2+ years' experience working as a Snowflake Data Architect and 4+ years in data warehouse, ETL and BI projects. Must have experience with at least two end-to-end implementations of a Snowflake cloud data warehouse and three end-to-end data warehouse implementations on-premises, preferably on Oracle. Expertise in Snowflake data modelling, ELT using Snowflake SQL, implementing complex stored procedures, and standard DWH and ETL concepts. Expertise in Snowflake advanced concepts such as setting up resource monitors, RBAC controls, virtual warehouse sizing, query performance tuning, zero-copy clone and time travel, and understanding of how to use these features. Expertise in deploying Snowflake features such as data sharing, events and lakehouse patterns. Hands-on experience with Snowflake utilities, SnowSQL, Snowpipe and Big Data modelling techniques using Python. Experience in data migration from RDBMS to a Snowflake cloud data warehouse. Deep understanding of relational as well as NoSQL data stores, methods and approaches (star and snowflake, dimensional modelling). Experience with data security, data access controls and design. Experience with AWS or Azure data storage and management technologies such as S3 and ADLS. Build processes supporting data transformation, data structures, metadata, dependency and workload management. Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning and troubleshooting. Provide resolution to an extensive range of complicated data pipeline related problems, proactively and as issues surface. Must have expertise in AWS or Azure Platform as a Service (PaaS). Certified Snowflake cloud data warehouse Architect (desirable). Should be able to troubleshoot problems across infrastructure, platform and application domains. Must have experience of Agile development methodologies. Strong written communication skills; effective and persuasive in both written and oral communication.

Nice to have Skills/Qualifications: Bachelor's and/or master's degree in computer science or equivalent experience. Strong communication, analytical and problem-solving skills with high attention to detail.
About You: You are self-motivated, collaborative, eager to learn, and hands-on. You love trying out new apps and find yourself coming up with ideas to improve them. You stay ahead of the latest trends and technologies. You are particular about following industry best practices and have high standards regarding quality.

Skills: ETL, RDBMS, PL/SQL, SQL, Python, data architecture, SnowSQL, data modelling, Azure, cloud, Snowflake, communication, data, data warehouse, features, architecture, AWS, BI projects, Snowpipe, Unix shell scripting, SQL queries, dbt
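A few of the Snowflake features named above (zero-copy clone, time travel, resource monitors) can be sketched from the Snowflake Python connector as below. The account, credentials and all object names are placeholders, and resource monitors normally require the ACCOUNTADMIN role.

```python
# Sketch of zero-copy clone, time travel and a resource monitor driven from the
# Snowflake Python connector; all identifiers and credentials are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="***",
    warehouse="ANALYTICS_WH", role="ACCOUNTADMIN",   # resource monitors need ACCOUNTADMIN
)
cur = conn.cursor()

# Zero-copy clone: instant dev copy of production with no extra storage up front.
cur.execute("CREATE OR REPLACE DATABASE ANALYTICS_DEV CLONE ANALYTICS_PROD")

# Time travel: query a table as it looked one hour ago.
cur.execute("SELECT COUNT(*) FROM ANALYTICS_PROD.SALES.ORDERS AT(OFFSET => -3600)")
print(cur.fetchone())

# Resource monitor: suspend the warehouse at 90% of a monthly credit quota.
cur.execute("""
    CREATE RESOURCE MONITOR IF NOT EXISTS ANALYTICS_RM
      WITH CREDIT_QUOTA = 100 FREQUENCY = MONTHLY START_TIMESTAMP = IMMEDIATELY
      TRIGGERS ON 90 PERCENT DO SUSPEND
""")
cur.execute("ALTER WAREHOUSE ANALYTICS_WH SET RESOURCE_MONITOR = ANALYTICS_RM")

cur.close()
conn.close()
```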
Posted 2 days ago
2.0 - 4.0 years
0 Lacs
Calcutta
On-site
Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Microsoft
Management Level: Associate

Job Description & Summary: At PwC, our people in software and product innovation focus on developing cutting-edge software solutions and driving product innovation to meet the evolving needs of clients. These individuals combine technical experience with creative thinking to deliver innovative software products and solutions. In testing and quality assurance at PwC, you will focus on the process of evaluating a system or software application to identify any defects, errors, or gaps in its functionality. Working in this area, you will execute various test cases and scenarios to validate that the system meets the specified requirements and performs as expected.

Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Job Description & Summary: A career within….

Responsibilities:
- Strong understanding of Selenium WebDriver for browser automation
- Knowledge of Selenium Grid for distributed test execution
- Proficient in a programming language used with Selenium, preferably C# or Java
- Experience with test automation frameworks like TestNG, JUnit (for Java), NUnit (for C#), or PyTest/unittest (for Python)
- Understanding of HTML, CSS, and JavaScript to effectively work with web-based applications
- Familiarity with version control systems like Git for code management
- Experience with CI/CD tools like Jenkins, Bamboo, or GitLab CI to integrate automated tests into the development pipeline
- Familiarity with tools like Azure DevOps, JIRA, etc. for defect tracking
- Understanding of RESTful services and experience with tools like Postman or RestAssured for API testing
- Basic SQL knowledge to perform database validations and queries for data verification
- Experience in cross-browser testing and understanding of browser compatibility issues
- Familiarity with performance/load testing and related tools, e.g. JMeter, MS LoadTest
- Exposure to Agile methodology

Mandatory skill sets: Selenium WebDriver for browser automation
Preferred skill sets: Basic SQL knowledge to perform database validations and queries for data verification
Years of experience required: 2-4 Years
Education qualification: B.Tech/B.E./MCA
Education (if blank, degree and/or field of study not specified): Degrees/Field of Study required: Bachelor of Technology, Bachelor of Engineering; Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills: Selenium WebDriver
Optional Skills: Structured Query Language (SQL)
Desired Languages (if blank, desired languages not specified)
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
Job Posting End Date
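Although this posting prefers C# or Java, the Selenium WebDriver API is essentially the same across languages; as a small illustration, here is a pytest-style test using the Python bindings. The URL, locators and expected behaviour are placeholder assumptions.

```python
# Tiny Selenium WebDriver sketch (Python bindings for illustration only; the
# posting prefers C#/Java). URL, locators and assertions are placeholders.
import pytest
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC


@pytest.fixture
def driver():
    drv = webdriver.Chrome()   # assumes Selenium Manager / local chromedriver
    yield drv
    drv.quit()


def test_search_returns_results(driver):
    driver.get("https://example.org/search")                    # placeholder app URL
    driver.find_element(By.ID, "query").send_keys("selenium")   # placeholder locator
    driver.find_element(By.CSS_SELECTOR, "button[type='submit']").click()
    # Explicit wait instead of sleeps: poll until results are rendered.
    results = WebDriverWait(driver, 10).until(
        EC.presence_of_all_elements_located((By.CSS_SELECTOR, ".result"))
    )
    assert len(results) > 0
```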
Posted 2 days ago
5.0 - 8.0 years
0 Lacs
Andhra Pradesh
Remote
Software Engineering Lead Analyst (A) - HIH - Evernorth

About Evernorth: Evernorth Health Services, a division of The Cigna Group (NYSE: CI), creates pharmacy, care, and benefits solutions to improve health and increase vitality. We relentlessly innovate to make the prediction, prevention, and treatment of illness and disease more accessible to millions of people.

Responsibilities: Design and implement software for the provider experience group on various initiatives. Provide support to our end-users by resolving their issues, responding to queries, and helping them analyze/interpret the results from the models. Develop, code, and unit test with a variety of cloud services, write infrastructure code using Terraform, and build ETL using Python/PySpark and a testing automation pipeline. Participate in peer code reviews. Develop reusable infrastructure code for commonly occurring work across multiple processes and services. Participate in planning and technical design discussions with other developers, managers, and architects to meet application requirements and performance goals. Manage the pipeline using Jenkins to move the application to higher environments such as System Testing, User Acceptance Testing, Release Testing, and User Training environments. Contribute to production support to resolve application production issues. Follow the guidelines of the Cloud COE and other teams for production deployment and maintenance activities for all applications running in AWS. Manage application demos to business users and Product Owners regularly in Sprint and PI demos. Work with business users and Product Owners to understand business requirements. Participate in Program Increment (PI) planning and user story grooming with Scrum Masters, developers, QA Analysts, and Product Owners. Participate in daily stand-up meetings to provide daily work status updates to the Scrum Master and Product Owner, following Agile methodology. Write Structured Query Language (SQL) stored procedures and SQL queries for create, read, update, and delete (CRUD) database operations. Write and maintain technical and design documents. Understand best practices for using the Guarantee Management tools and applications.

Required Skills: Excellent debugging, analytical, and problem-solving skills. Excellent communication skills.

Required Experience & Education: Bachelor's in computer science or a related field, or equivalent relevant work experience and technical knowledge. 5-8 years of total related experience. Experience as a Full Stack Python/PySpark Developer with hands-on experience in AWS Cloud Services. Experienced in software development in Java and an open-source tech stack. Strong proficiency in client-side languages and frameworks such as React or Angular, and NodeJS. Hands-on experience in AWS cloud development. Experience with CI/CD tools such as AWS CloudFormation, Jenkins, Conduits, and GitHub. Experience in microservice architecture. Exposure to SOLID principles, architectural patterns, and development best practices. Experience in unit testing automation, test-driven development and use of mocking frameworks. Experience working in Agile/Scrum teams. Hands-on experience with infrastructure as code in Terraform. SQL and NoSQL experience.

Desired Experience: Experience building in an event-driven architecture is a plus. Security engineering or knowledge of AWS IAM principles is a plus. Kafka knowledge is a plus. NoSQL solutions are a plus.

Location & Hours of Work: Full-time position, working 40 hours per week.
Expected overlap with US hours as appropriate. Primarily based in the Innovation Hub in Hyderabad, India, in a hybrid working model (3 days WFO and 2 days WFH).

About Evernorth Health Services: Evernorth Health Services, a division of The Cigna Group, creates pharmacy, care and benefit solutions to improve health and increase vitality. We relentlessly innovate to make the prediction, prevention and treatment of illness and disease more accessible to millions of people. Join us in driving growth and improving lives.
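As a rough sketch of the Python/PySpark ETL work described above, the snippet below reads a raw extract from S3, applies a couple of transformations and writes partitioned Parquet. The S3 paths, column names and business rule are illustrative assumptions only.

```python
# Minimal PySpark ETL sketch; paths, columns and the rule are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("provider-claims-etl").getOrCreate()

# Extract: raw claims landed as CSV in S3 (path is a placeholder).
claims = spark.read.csv("s3://raw-bucket/claims/2024/", header=True, inferSchema=True)

# Transform: normalise dates, drop obviously bad rows, derive a simple flag.
clean = (
    claims
    .withColumn("service_date", F.to_date("service_date", "yyyy-MM-dd"))
    .filter(F.col("claim_amount") > 0)
    .withColumn("high_cost", F.col("claim_amount") > 10000)
)

# Load: partitioned Parquet for downstream analytics (path is a placeholder).
clean.write.mode("overwrite").partitionBy("service_date").parquet(
    "s3://curated-bucket/claims/"
)
spark.stop()
```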
Posted 2 days ago
5.0 - 8.0 years
0 Lacs
Andhra Pradesh
On-site
Automation Engineer - Lead Analyst

Position Overview: The Evernorth Core Platform Engineering team is looking for an Automation Engineer Lead Analyst. The Automation Engineer Lead Analyst will play a pivotal role in the system development cycle of web and mobile applications by understanding the overall architecture and workflows required to formulate test strategies, specifically on the Packaged Business Capabilities (PBCs). As a member of our team, you will work in a high-performance, high-frequency enterprise technology environment. This role will work with all levels of the business, ensuring that deliverables align with business requirements with measurable results, and will coordinate the work of project teams along multiple workstreams such as data partners, integrated systems, external vendors, etc.

Responsibilities: Coordinates the work of project teams along multiple workstreams such as data partners, integrated systems, external vendors, etc. Queries and analyzes data to verify results, troubleshoot production issues, and enhance the test automation suites. Crafts automated tests to meet speed-to-market goals while ensuring quality. Troubleshoots and optimizes automated tests and supporting artifacts to execute automatically in CI/CD pipelines and reduce cycle time. Escalates risks and issues in a timely manner to enable effective planning and communication to stakeholders. Embraces the mindset of fearlessly engaging in manual hands-on and exploratory testing whenever circumstances demand it. Responsible for identifying, documenting, and effectively resolving defects through meticulous reporting and tracking. Adheres to the organization's Quality Engineering best practices while helping to drive changes to our testing practices where necessary. Enhances the automation platform as needs arise.

Qualifications: A proven track record of 5 to 8 years in successfully testing and ensuring the quality of web and mobile applications. Proficient in conducting thorough business requirements analysis, designing efficient test automation suites, and diligently logging and tracking defects throughout the testing process. Expertise in applying agile methodologies and principles to software testing, ensuring efficient and effective testing practices throughout the development lifecycle. Demonstrated proficiency in utilizing test management tools such as Jira to effectively plan, track, and manage testing activities, ensuring seamless collaboration and streamlined workflows. A strong foundation and practical experience in programming languages is essential, with particular emphasis on JavaScript or TypeScript, which are highly preferred. Proven experience in automating Web UI and API testing using Cypress, ensuring robust and reliable test coverage for both front-end and back-end functionalities. Good experience in mobile automation using WebdriverIO across various mobile testing platforms and devices. Good experience with Git-based source control tools like GitLab, GitHub and Bitbucket. Experience with SQL and database/backend testing. Hands-on experience with API testing tools such as Postman and SoapUI.

Required Experience & Education: Bachelor's degree in computer science, information technology or related fields. Healthcare domain knowledge. Mobile automation experience. Knowledge of the JavaScript and TypeScript programming languages. Exposure to AWS services (DynamoDB, S3 buckets, Lambdas, etc.).
Excellent written and verbal communication skills Solid analytical skills, highly organized, self-motivated and a quick learner Flexible and willing to accept change in priorities as necessary About Evernorth Health Services Evernorth Health Services, a division of The Cigna Group, creates pharmacy, care and benefit solutions to improve health and increase vitality. We relentlessly innovate to make the prediction, prevention and treatment of illness and disease more accessible to millions of people. Join us in driving growth and improving lives.
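The posting's own tooling is Cypress/WebdriverIO for UI and Postman/SoapUI for APIs; purely to illustrate the idea of automated API checks running in a CI pipeline, here is an equivalent sketch using pytest and the requests library. The base URL, payload and expected status codes are placeholder assumptions.

```python
# Illustrative automated API checks with pytest + requests (the posting's own
# tools are Postman/SoapUI); endpoint, payload and codes are placeholders.
import requests

BASE_URL = "https://api.example.org"   # placeholder service under test


def test_create_member_returns_201_and_echoes_id():
    payload = {"memberId": "M-1001", "plan": "PBC-BASIC"}
    resp = requests.post(f"{BASE_URL}/members", json=payload, timeout=10)
    assert resp.status_code == 201
    body = resp.json()
    assert body["memberId"] == payload["memberId"]


def test_unknown_member_returns_404():
    resp = requests.get(f"{BASE_URL}/members/does-not-exist", timeout=10)
    assert resp.status_code == 404
```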
Posted 2 days ago
3.0 - 5.0 years
0 Lacs
Andhra Pradesh
On-site
Software Engineering Senior Analyst - HIH - Evernorth

About Evernorth: Evernorth Health Services, a division of The Cigna Group (NYSE: CI), creates pharmacy, care, and benefits solutions to improve health and increase vitality. We relentlessly innovate to make the prediction, prevention, and treatment of illness and disease more accessible to millions of people.

Software Engineering Senior Analyst Position Overview: Software Engineer supporting Cigna's Provider Technology organization.

Responsibilities: Design and implement software for the provider experience group on various initiatives. Provide support to our end-users by resolving their issues, responding to queries, and helping them analyze/interpret the results from the models. Develop, code, and unit test with a variety of cloud services, write infrastructure code using Terraform, and build ETL using Python/PySpark and a testing automation pipeline. Participate in peer code reviews. Develop reusable infrastructure code for commonly occurring work across multiple processes and services. Participate in planning and technical design discussions with other developers, managers, and architects to meet application requirements and performance goals. Manage the pipeline using Jenkins to move the application to higher environments such as System Testing, User Acceptance Testing, Release Testing, and User Training environments. Contribute to production support to resolve application production issues. Follow the guidelines of the Cloud COE and other teams for production deployment and maintenance activities for all applications running in AWS. Manage application demos to business users and Product Owners regularly in Sprint and PI demos. Work with business users and Product Owners to understand business requirements. Participate in Program Increment (PI) planning and user story grooming with Scrum Masters, developers, QA Analysts, and Product Owners. Participate in daily stand-up meetings to provide daily work status updates to the Scrum Master and Product Owner, following Agile methodology. Write Structured Query Language (SQL) stored procedures and SQL queries for create, read, update, and delete (CRUD) database operations. Write and maintain technical and design documents. Understand best practices for using the Guarantee Management tools and applications.

Required Skills: Excellent debugging, analytical, and problem-solving skills. Excellent communication skills.

Required Experience & Education: Bachelor's in computer science or a related field, or equivalent relevant work experience and technical knowledge. 3-5 years of total related experience. Experience as a Full Stack Python/PySpark Developer with hands-on experience in AWS Cloud Services. Hands-on experience in AWS cloud development. Experience with CI/CD tools such as AWS CloudFormation, Jenkins, Conduits, and GitHub. Experience in microservice architecture. Exposure to SOLID principles, architectural patterns, and development best practices. Experience in unit testing automation, test-driven development, and use of mocking frameworks. Experience working in Agile/Scrum teams. Hands-on experience with infrastructure as code in Terraform.

Desired Experience: Experience building in an event-driven architecture is a plus. Security engineering or knowledge of AWS IAM principles is a plus. Kafka knowledge is a plus. NoSQL solutions are a plus. Databricks experience is good to have. Application development experience is also needed. Experienced in software development in Java and an open-source tech stack.
Strong proficiency in client-side languages and frameworks such as React or NodeJS.

Equal Opportunity Statement: Evernorth is an Equal Opportunity Employer actively encouraging and supporting organization-wide involvement of staff in diversity, equity, and inclusion efforts to educate, inform and advance both internal practices and external work with diverse client populations.

About Evernorth Health Services: Evernorth Health Services, a division of The Cigna Group, creates pharmacy, care and benefit solutions to improve health and increase vitality. We relentlessly innovate to make the prediction, prevention and treatment of illness and disease more accessible to millions of people. Join us in driving growth and improving lives.
Posted 2 days ago
0 years
0 Lacs
Gurgaon, Haryana, India
On-site
Maintain high accuracy and productivity standards and have a willingness to learn. GDS knowledge required. Interact with clients through email for query resolution, and build and maintain good working relationships with customers. Participate in process improvement initiatives; be flexible and change-ready in a dynamic work environment.
Posted 2 days ago
2.0 years
5 - 40 Lacs
Greater Kolkata Area
On-site
Location: Bangalore, Pune, Chennai, Kolkata, Gurugram
Experience: 5-15 Years
Work Mode: Hybrid
Mandatory Skills: Snowflake, SQL queries, Snowpipe, SnowSQL, dbt, Architect

Overview: Ensure new features and subject areas are modelled to integrate with existing structures and provide a consistent view. Develop and maintain documentation of the data architecture, data flow and data models of the data warehouse appropriate for various audiences. Provide direction on adoption of cloud technologies (Snowflake) and industry best practices in the field of data warehouse architecture and modelling. Provide technical leadership to large enterprise-scale projects. You will also be responsible for preparing estimates and defining technical solutions for proposals (RFPs). This role requires a broad range of skills and the ability to step into different roles depending on the size and scope of the project.

Roles & Responsibilities: Must have 5+ years total in IT, 2+ years' experience working as a Snowflake Data Architect and 4+ years in data warehouse, ETL and BI projects. Must have experience with at least two end-to-end implementations of a Snowflake cloud data warehouse and three end-to-end data warehouse implementations on-premises, preferably on Oracle. Expertise in Snowflake data modelling, ELT using Snowflake SQL, implementing complex stored procedures, and standard DWH and ETL concepts. Expertise in Snowflake advanced concepts such as setting up resource monitors, RBAC controls, virtual warehouse sizing, query performance tuning, zero-copy clone and time travel, and understanding of how to use these features. Expertise in deploying Snowflake features such as data sharing, events and lakehouse patterns. Hands-on experience with Snowflake utilities, SnowSQL, Snowpipe and Big Data modelling techniques using Python. Experience in data migration from RDBMS to a Snowflake cloud data warehouse. Deep understanding of relational as well as NoSQL data stores, methods and approaches (star and snowflake, dimensional modelling). Experience with data security, data access controls and design. Experience with AWS or Azure data storage and management technologies such as S3 and ADLS. Build processes supporting data transformation, data structures, metadata, dependency and workload management. Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning and troubleshooting. Provide resolution to an extensive range of complicated data pipeline related problems, proactively and as issues surface. Must have expertise in AWS or Azure Platform as a Service (PaaS). Certified Snowflake cloud data warehouse Architect (desirable). Should be able to troubleshoot problems across infrastructure, platform and application domains. Must have experience of Agile development methodologies. Strong written communication skills; effective and persuasive in both written and oral communication.

Nice to have Skills/Qualifications: Bachelor's and/or master's degree in computer science or equivalent experience. Strong communication, analytical and problem-solving skills with high attention to detail.
About You: You are self-motivated, collaborative, eager to learn, and hands-on. You love trying out new apps and find yourself coming up with ideas to improve them. You stay ahead of the latest trends and technologies. You are particular about following industry best practices and have high standards regarding quality.

Skills: ETL, RDBMS, PL/SQL, SQL, Python, data architecture, SnowSQL, data modelling, Azure, cloud, Snowflake, communication, data, data warehouse, features, architecture, AWS, BI projects, Snowpipe, Unix shell scripting, SQL queries, dbt
Posted 2 days ago
3.0 years
5 - 40 Lacs
Greater Kolkata Area
On-site
Location: Bangalore, Pune, Chennai, Kolkata, Gurugram
Experience: 5-15 Years
Work Mode: Hybrid
Mandatory Skills: Snowflake / Azure Data Factory / PySpark / Databricks / Snowpipe
Good to Have: SnowPro Certification

Primary Roles and Responsibilities: Develop Modern Data Warehouse solutions using Snowflake, Databricks and ADF. Provide forward-thinking solutions in the data engineering and analytics space. Collaborate with DW/BI leads to understand new ETL pipeline development requirements. Triage issues to find gaps in existing pipelines and fix them. Work with the business to understand reporting-layer needs and develop a data model that fulfils them. Help junior team members resolve issues and technical challenges. Drive technical discussions with the client architect and team members. Orchestrate the data pipelines in the scheduler via Airflow.

Skills and Qualifications: Bachelor's and/or master's degree in computer science or equivalent experience. Must have 6+ years of total IT experience and 3+ years' experience in data warehouse/ETL projects. Expertise in Snowflake security, Snowflake SQL and designing/implementing other Snowflake objects. Hands-on experience with Snowflake utilities, SnowSQL, Snowpipe, Snowsight and Snowflake connectors. Deep understanding of Star and Snowflake dimensional modeling. Strong knowledge of data management principles. Good understanding of the Databricks Data & AI platform and Databricks Delta Lake architecture. Hands-on experience in SQL and Spark (PySpark). Experience in building ETL / data warehouse transformation processes. Experience with open-source non-relational / NoSQL data repositories (incl. MongoDB, Cassandra, Neo4j). Experience working with structured and unstructured data, including imaging and geospatial data. Experience working in a DevOps environment with tools such as Terraform, CircleCI and Git. Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning, troubleshooting and query optimization. Databricks Certified Data Engineer Associate/Professional certification (desirable). Comfortable working in a dynamic, fast-paced, innovative environment with several concurrent projects. Experience working in Agile methodology. Strong verbal and written communication skills. Strong analytical and problem-solving skills with high attention to detail.

Skills: pipelines, PySpark, PL/SQL, Snowflake, reporting, Snowpipe, Databricks, Spark, Azure Data Factory, projects, data warehouse, Unix shell scripting, SnowSQL, troubleshooting, RDBMS, SQL, data management principles, data, query optimization, Azure, NoSQL databases, CircleCI, Git, Terraform, Snowflake utilities, performance tuning, ETL, architect
Posted 2 days ago
4.0 - 8.0 years
0 Lacs
Goa Velha, Goa, India
On-site
Talent Worx is on the lookout for a skilled GIS Developer who is passionate about geospatial technologies and eager to make a significant impact through innovative mapping solutions. In this role, you will design, develop, and maintain GIS applications, ensuring that data visualization and analysis meet client needs effectively.

Experience: 4 to 8 years
Location: Bangalore, Coimbatore, Delhi NCR, Mumbai

Requirements
Key Responsibilities: Design, develop, and maintain GIS applications using industry-standard technologies and tools. Implement spatial data models and integrate geospatial data from various sources. Utilize GIS APIs and libraries (such as ArcGIS, QGIS, or Leaflet) to create interactive maps and visualizations. Perform spatial analysis and support data analysis projects. Collaborate with cross-functional teams to gather requirements and align GIS solutions with business objectives. Optimize GIS database performance and ensure data accuracy and integrity. Provide support and training to colleagues on GIS tools and applications. Stay current with geospatial technologies and best practices in GIS development.

Technical Skills: Proficiency in GIS software and tools (ArcGIS, QGIS, etc.). Experience with GIS programming languages (Python, JavaScript, or similar). Familiarity with spatial databases (PostGIS, SpatiaLite) and query languages. Knowledge of web mapping technologies (Leaflet, OpenLayers, or Google Maps API). Strong understanding of geospatial data formats (Shapefiles, GeoJSON, KML, etc.). Experience with data visualization tools is a plus.

Required Qualifications: Bachelor's degree in Geographical Information Systems, Computer Science, or a related field. Minimum of 3 years of experience in GIS development or a related role. Strong analytical skills and attention to detail. Excellent communication skills and the ability to work collaboratively.

Benefits: Work with one of the Big 4 firms in India. Healthy work environment. Work-life balance.
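As a small taste of the geospatial work described above, here is a Python sketch using shapely for a point-in-polygon check on a GeoJSON-style feature. The feature, coordinates and property names are made-up sample data.

```python
# Point-in-polygon and area check with shapely; the GeoJSON-style feature and
# coordinates are invented sample data, not client data.
from shapely.geometry import shape, Point

ward_feature = {                      # sample GeoJSON-style feature
    "type": "Feature",
    "properties": {"name": "Ward 12"},
    "geometry": {
        "type": "Polygon",
        "coordinates": [[
            [77.55, 12.95], [77.65, 12.95], [77.65, 13.05], [77.55, 13.05], [77.55, 12.95]
        ]],
    },
}

ward = shape(ward_feature["geometry"])       # build a shapely Polygon from GeoJSON
site = Point(77.60, 13.00)                   # candidate location (lon, lat)

print("inside ward:", ward.contains(site))   # point-in-polygon test
print("area (deg^2):", ward.area)            # planar area in degrees; reproject to a
                                             # metric CRS for a real-world area figure
```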
Posted 2 days ago
3.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Overview Cvent is a leading meetings, events, and hospitality technology provider with more than 4,800 employees and ~22,000 customers worldwide, including 53% of the Fortune 500. Founded in 1999, Cvent delivers a comprehensive event marketing and management platform for marketers and event professionals and offers software solutions to hotels, special event venues and destinations to help them grow their group/MICE and corporate travel business. Our technology brings millions of people together at events around the world. In short, we’re transforming the meetings and events industry through innovative technology that powers the human connection. The DNA of Cvent is our people, and our culture has an emphasis on fostering intrapreneurship - a system that encourages Cventers to think and act like individual entrepreneurs and empowers them to take action, embrace risk, and make decisions as if they had founded the company themselves. At Cvent, we value the diverse perspectives that each individual brings. Whether working with a team of colleagues or with clients, we ensure that we foster a culture that celebrates differences and builds on shared connections. In This Role, You Will Work with a talented group of data scientists, software developers and product owners to identify possible applications for AI and machine learning. Understand Cvent's product lines, their business models and the data collected. Perform machine learning research on various types of data ( numeric, text, audio, video, image). Deliver machine learning models in a state that can be picked up by a software development team and operationalized via Cvent's products. Thoroughly document your work as well as results, to the extent that another data scientist can replicate them. Progressively enhance your skills in machine learning, writing production-quality Python code, and communication. Here's What You Need A Bachelor's degree in a quantitative field (natural sciences, math, statistics, computer science). At least 3 years of experience working as a data scientist in industry. In-depth familiarity with the Linux operating system and command-line work. Conceptual and technical understanding of machine learning, including model training and evaluation. Experience with formal Python coding. Proficiency in machine learning packages in Python. Familiarity with Generative AI based system development. Experience with relational databases and query-writing in SQL. Knowledge of linear algebra and statistics. Skills in data exploration and interpretation. It Will Be Excellent If You Also Have A Master's or PhD degree in a quantitative field. Experience with Databricks and/or Snowflake platforms. Ability to write production-quality Python code with testing coverage. Experience working on a cloud platform (AWS/Azure/Google Cloud), especially machine learning R&D on a cloud platform. Knowledge of the software development lifecycle, including Git processes, code reviews, test-driven development and CI/CD. Experience with A/B testing. Skills in data visualization and dashboarding. Knowledge of how to interact with a REST API. Proven ability for proactive, independent work with some supervision. Strong verbal and written communication.
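To ground the model training and evaluation the role describes, here is a compact scikit-learn sketch on synthetic data; the features, model choice and metric are illustrative, not Cvent's actual pipeline.

```python
# Model training and evaluation sketch with scikit-learn on synthetic data.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

X, y = make_classification(n_samples=5000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42
)

model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
model.fit(X_train, y_train)

# Evaluate on held-out data so the score reflects generalisation, not memorisation.
probs = model.predict_proba(X_test)[:, 1]
print(f"test ROC AUC: {roc_auc_score(y_test, probs):.3f}")
```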
Posted 2 days ago
7.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
About the Company: We are one of India's premier integrated political consulting firms specializing in building data-driven 360-degree election campaigns. We help our clients with strategic advice and implementation, bringing together data-backed insights and in-depth ground intelligence into a holistic electoral campaign. We are passionate about our democracy and the politics that shape the world around us. We draw on some of the sharpest minds from distinguished institutions and diverse professional backgrounds to help us achieve our goal. The team brings seven years of experience in building electoral strategies that spark conversations, effect change, and help shape electoral and legislative ecosystems in our country.

Job Summary: We are looking for a motivated and detail-oriented Statistics Intern to join our team. This internship offers an excellent opportunity to apply academic knowledge of statistics and data analysis in a real-world setting. The intern will assist in data cleaning, statistical modeling, visualization, and research reporting across various projects.

Key Responsibilities:
1. Assist in collecting, organizing, and cleaning datasets for analysis
2. Conduct basic statistical analyses (e.g., descriptive stats, cross-tabulations, hypothesis tests)
3. Support the development of charts, graphs, and summary reports
4. Help build and/or validate statistical models under supervision (e.g., regression, classification)
5. Collaborate with team members to interpret results and draw meaningful insights
6. Document methods and maintain organized records of code and findings

Required/Minimum Qualifications:
1. Currently pursuing (Master's) or recently completed a Bachelor's degree in Statistics, Mathematics, Economics, Data Science, or a related field
2. Basic understanding of statistical concepts (probability, statistics, Bayesian inference, hypothesis testing, etc.) and data structures; query-writing skills and data automation
3. Familiarity with statistical software such as R, SPSS, Stata, etc.
4. Working knowledge of, and demonstrated ability to code using, numerical/statistical/ML libraries (NumPy, statsmodels, pandas, etc.); Python is a must
5. Ability to work with datasets, conduct exploratory data analysis, and interpret output
6. Strong attention to detail and problem-solving abilities
7. Good written and verbal communication skills

Good to have Skills:
1. Experience with data visualization tools or packages (e.g., ggplot2, matplotlib, Tableau)
2. Knowledge of survey data, experimental design, or basic machine learning techniques such as KNN and NLMs
3. Ability to write clean, reproducible code (e.g., using data automation tools in Excel such as VBA, or Python scripts)

Location: BLR - 4th Floor, VK Kalyani Commercial Complex, Opp to BDA Sanky Road, Bangalore, 560021
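As a minimal example of the descriptive statistics and hypothesis testing mentioned in the responsibilities, here is a short Python sketch using pandas and SciPy on simulated data; the groups, sample sizes and effect are invented for illustration.

```python
# Descriptive stats and a two-sample hypothesis test on simulated survey data.
import numpy as np
import pandas as pd
from scipy import stats

rng = np.random.default_rng(7)
df = pd.DataFrame({
    "group": np.repeat(["A", "B"], 200),
    "score": np.concatenate([rng.normal(52, 10, 200), rng.normal(55, 10, 200)]),
})

# Descriptive statistics per group.
print(df.groupby("group")["score"].describe())

# Two-sample t-test: is the mean score different between groups?
a = df.loc[df.group == "A", "score"]
b = df.loc[df.group == "B", "score"]
t_stat, p_value = stats.ttest_ind(a, b, equal_var=False)   # Welch's t-test
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```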
Posted 2 days ago
0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
We are looking for an experienced Power BI Full Stack Developer with strong expertise across the Microsoft BI stack. The candidate will be responsible for delivering end-to-end data visualization solutions, leveraging tools like Power BI, SQL Server, SSIS, and Azure services within a structured SDLC. Key Responsibilities: Work with stakeholders to gather business reporting and dashboarding requirements Design and develop end-to-end Power BI solutions, including datasets, dataflows, and reports Build optimized data models (star/snowflake) and implement advanced DAX measures Prepare and transform data using Power Query (M language) Write and maintain complex SQL queries, stored procedures, and views in SQL Server Integrate data from Microsoft stack components – SQL Server, Excel, SSAS, SSIS, ADF, Azure SQL Deploy and schedule reports in Power BI Service, manage workspace access and RLS Collaborate with backend, ETL, and DevOps teams for seamless data integration and deployment Apply performance tuning and implement best practices for Power BI governance Prepare technical documentation and support testing, deployment, and production maintenance Required Skills: Hands-on experience with Power BI Desktop, Power BI Service, and DAX Strong SQL Server experience (T-SQL, views, stored procedures, indexing) Proficiency in Power Query, data modeling, and data transformation Experience with Microsoft BI stack: SSIS (ETL workflows) SSAS (Tabular Model preferred) Azure SQL Database, Azure Data Factory (ADF) Understanding of CI/CD for BI artifacts and use of DevOps tools (e.g., Azure DevOps, Git) Preferred Qualifications: Experience in banking or financial services domain Familiarity with enterprise data warehouse concepts and reporting frameworks Strong problem-solving skills and ability to present insights to business users
Posted 2 days ago
0.0 - 2.0 years
5 - 8 Lacs
Gurugram, Haryana
On-site
Job description
Job Title: Full Stack Developer (Experience: 3-5 Years)
Location: Gurugram, Haryana
Job Type: Full-Time
Department: Engineering / Development

Job Summary: We are looking for a talented and motivated Full Stack Developer to join our development team. The ideal candidate has strong experience building scalable web applications and services across the full technology stack. You will be responsible for both front-end and back-end development, and will collaborate closely with designers, product managers, and other developers to deliver high-quality software solutions.

Key Responsibilities: Design, develop, and maintain scalable front-end and back-end applications. Collaborate with cross-functional teams to define, design, and ship new features. Write clean, maintainable, and efficient code using best practices. Ensure the responsiveness, security, and performance of applications. Build and maintain APIs, microservices, and integrations with external systems. Perform code reviews, unit testing, and debugging to ensure code quality. Continuously discover, evaluate, and implement new technologies. Participate in agile development processes including sprint planning and standups. Troubleshoot, debug, and upgrade existing software. Document software functionality and development processes.

Required Qualifications: Bachelor's degree in computer science, Engineering, or a related field (or equivalent experience). Proven experience as a Full Stack Developer or in a similar role. Proficiency in front-end technologies (JavaScript, TypeScript, React, Next.js, React Native, Angular, Tailwind CSS, front-end optimization, TanStack Query, Axios, state management). Knowledge of maps integration. Strong experience with back-end languages and frameworks (e.g., Node.js, Express, Python, Java, Spring Boot). Familiarity with database technologies (SQL and NoSQL, e.g., MySQL, PostgreSQL, MongoDB; ORM tools like Prisma and Drizzle; ODM: Mongoose). Experience with RESTful APIs, GraphQL, and web services integration. Version control experience (e.g., Git, GitHub, GitLab). Solid understanding of CI/CD pipelines and DevOps practices (e.g., deployment to cloud servers such as Azure or AWS). Understanding of security and data protection best practices (e.g., JWT, RBAC, OAuth).

Job Types: Full-time, Permanent
Pay: ₹500,000.00 - ₹800,000.00 per year
Benefits: Health insurance, Provident Fund
Schedule: Day shift, Monday to Friday, Morning shift
Experience: Full-stack development: 2 years (Preferred)
Work Location: In person
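Since Python is among the back-end options listed, here is a minimal REST API sketch in Flask (2.x assumed) with a very simplified role check standing in for JWT/RBAC; the routes, in-memory store and header-based check are illustrative assumptions, not a production auth design.

```python
# Minimal Flask REST API sketch; routes, data and the crude role check are
# placeholders, not a production design (real systems would verify a signed JWT).
from flask import Flask, jsonify, request, abort

app = Flask(__name__)

ORDERS = {1: {"id": 1, "item": "widget", "qty": 3}}   # stand-in for a real database


def require_role(role: str):
    # Real systems would validate a signed JWT; here we only read a header.
    if request.headers.get("X-Role") != role:
        abort(403)


@app.get("/api/orders/<int:order_id>")
def get_order(order_id: int):
    order = ORDERS.get(order_id)
    if order is None:
        abort(404)
    return jsonify(order)


@app.post("/api/orders")
def create_order():
    require_role("admin")                             # crude RBAC stand-in
    data = request.get_json(force=True)
    new_id = max(ORDERS) + 1
    ORDERS[new_id] = {"id": new_id, **data}
    return jsonify(ORDERS[new_id]), 201


if __name__ == "__main__":
    app.run(debug=True)
```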
Posted 2 days ago
4.0 years
0 Lacs
Defence Colony, Delhi, India
On-site
Organization Name: Legitquest (LQ Global Services Pvt. Ltd.)
Experience: 4+ Years
Compensation range: INR 9,00,000 to INR 10,00,000
Location of posting: Delhi

Company Profile: Legitquest is a GenAI platform designed for law firms, corporations, and government institutions, augmenting productivity and automating complex workflows. Utilizing advanced algorithms and reasoning-capable Large Language Models (LLMs), which have been meticulously customized and developed by our team of legal experts, engineers, and researchers, Legitquest delivers cutting-edge solutions tailored to the legal sector's needs. To achieve a strong product-market fit, we are rapidly scaling our team to meet the growing demand and continue advancing our innovative offerings. The venture is backed by Water Bridge and Info Edge.

About the Role: We're hiring a Software Developer with strong experience in C#, .NET (Core/Framework), MySQL, and Elasticsearch. You'll be working on our core document search infrastructure, enhancing the speed, accuracy, and performance of legal data search across millions of documents.

Required Skills: C# and .NET Framework/Core (4+ years of experience). MySQL: schema design, query optimization, and indexing. Elasticsearch: deep understanding of architecture, indexing, querying, and performance tuning. Clean, scalable, maintainable code practices. Familiarity with version control systems (e.g., Git). Solid debugging, problem-solving, and troubleshooting skills.

Roles & Responsibilities: Maintain and enhance existing C# scripts for Elasticsearch integrations. Design and manage Elasticsearch indices and queries to deliver fast, accurate search results. Collaborate with frontend, product, and legal data teams to understand user needs. Ensure performance, security, and scalability benchmarks are met. Write efficient and reusable code. Investigate and resolve production issues with long-term solutions.

Preferred Qualifications: Experience working in legal-tech or with document-heavy systems. Familiarity with microservices architecture. Exposure to containerization (Docker/Kubernetes) is a plus.

Why Join Legitquest? Be part of a product team redefining how legal professionals work. Solve meaningful, large-scale data problems. Work closely with cross-functional experts in tech, product, and law. Opportunity to contribute to core system design decisions. Collaborative and learning-focused environment.

Contact Information: Contact Person: Tanu Singh. Contact: 85888 58429. Email: career@legitquest.com. Website: www.legitquest.com
Posted 2 days ago
6.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
At EY, we’re all in to shape your future with confidence. We’ll help you succeed in a globally connected powerhouse of diverse teams and take your career wherever you want it to go. Join EY and help to build a better working world.
Cloud Data Engineer (AWS)
To qualify for the role, you must have
Excellent academic background, including at a minimum a bachelor's/master's degree in Engineering, Computer Science, or another related field with a strong quantitative focus.
6+ years of working experience with particular focus on the following:
Collaborate with business teams to improve data models feeding Business Intelligence reporting tools, increasing data accessibility and fostering data-driven decision making across the organization.
Build data pipelines that ingest, clean, transform and aggregate data from disparate sources using AWS Data Flow.
Amazon S3: Data storage, partitioning, and lifecycle policies.
AWS Glue: ETL development, Glue Studio, and Glue Data Catalog.
Amazon Redshift: Data warehousing, Redshift Spectrum, and performance tuning.
Amazon Athena: Serverless querying of S3 data using SQL.
Amazon Kinesis / MSK: Real-time data streaming and ingestion.
AWS Lambda: Serverless data processing and orchestration.
AWS Step Functions: Workflow orchestration for data pipelines.
IAM: Fine-grained access control and security best practices.
Snowflake SQL: Advanced querying, window functions, and stored procedures.
Data Loading/Unloading: Using COPY INTO from S3 and external stages (a brief sketch follows this posting).
Snowpipe: Real-time data ingestion from AWS S3.
Streams & Tasks: For change data capture (CDC) and automation.
Time Travel & Cloning: For data recovery and testing.
Performance Tuning: Query profiling, clustering keys, and warehouse sizing.
Build reliable, highly scalable, and highly performing applications for data analytics on AWS.
Develop a data strategy for the long-term data platform architecture by working closely with business units.
Good to have
Software development experience in Python
Angular front-end skills
Experience in advanced data visualization tools, such as Tableau and Power BI, for integration between disparate data sources, design and implementation of KPIs, and generation of automatic and scalable visualizations that facilitate extraction of business insights.
Soft Skills
Strong communication, presentation, client service and technical writing skills in English for both technical and business audiences.
Strong analytical, problem solving and critical thinking skills.
Ability to work under tight timelines for multiple project deliveries.
Ability/flexibility to travel and work abroad for international projects.
Roles and Responsibilities:
Understand existing application architecture and solution design
Design individual components and develop the components
Work with other architects and team members in an agile scrum environment
Hands-on development
Design and develop applications that can be hosted on Azure or AWS cloud
Design and develop framework and core functionality
Research and innovate new concepts in data engineering on Azure or AWS
Understand enterprise application design framework and processes
Review code and establish best practices
Look out for the latest technologies, match them up with EY use cases and solve business problems efficiently
EY | Building a better working world
EY is building a better working world by creating new value for clients, people, society and the planet, while building trust in capital markets.
Enabled by data, AI and advanced technology, EY teams help clients shape the future with confidence and develop answers for the most pressing issues of today and tomorrow. EY teams work across a full spectrum of services in assurance, consulting, tax, strategy and transactions. Fueled by sector insights, a globally connected, multi-disciplinary network and diverse ecosystem partners, EY teams can provide services in more than 150 countries and territories.
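The EY posting above asks for hands-on Snowflake loading skills (COPY INTO from S3 and external stages) driven from pipeline code. Below is a minimal sketch using the Snowflake Python connector; the account, credentials, stage, and table names are purely illustrative assumptions, and the stage/file-format setup would depend on the project.

```python
# Minimal sketch: load staged S3 files into a Snowflake table with COPY INTO.
# Assumes snowflake-connector-python; account, stage, and table names are illustrative.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",       # illustrative account identifier
    user="etl_user",
    password="***",             # in practice, use key-pair auth or a secrets manager
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="RAW",
)

try:
    cur = conn.cursor()
    # The external stage is assumed to already point at the S3 bucket/prefix.
    cur.execute("""
        COPY INTO RAW.SALES_EVENTS
        FROM @S3_RAW_STAGE/sales/2025/07/
        FILE_FORMAT = (TYPE = PARQUET)
        MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE
        ON_ERROR = 'ABORT_STATEMENT'
    """)
    # Each result row summarizes one loaded file (rows parsed, rows loaded, errors).
    for row in cur.fetchall():
        print(row)
finally:
    conn.close()
```

Snowpipe automates essentially the same COPY statement, triggered by S3 event notifications instead of a scheduled batch run.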
Posted 2 days ago
7.0 years
25 - 35 Lacs
Hyderabad, Telangana, India
Remote
We are seeking a highly skilled Lead Data Engineer/Associate Architect to lead the design, implementation, and optimization of scalable data architectures. The ideal candidate will have a deep understanding of data modeling, ETL processes, cloud data solutions, and big data technologies. You will work closely with cross-functional teams to build robust, high-performance data pipelines and infrastructure to enable data-driven decision-making.
Experience: 7 - 12 years
Work Location: Hyderabad (Hybrid) / Remote
Mandatory skills: Python, SQL, Snowflake
Responsibilities
Design and develop scalable and resilient data architectures that support business needs, analytics, and AI/ML workloads.
Data Pipeline Development: Design and implement robust ETL/ELT processes to ensure efficient data ingestion, transformation, and storage (see the sketch after this posting).
Big Data & Cloud Solutions: Architect data solutions using cloud platforms like AWS, Azure, or GCP, leveraging services such as Snowflake, Redshift, BigQuery, and Databricks.
Database Optimization: Ensure performance tuning, indexing strategies, and query optimization for relational and NoSQL databases.
Data Governance & Security: Implement best practices for data quality, metadata management, compliance (GDPR, CCPA), and security.
Collaboration & Leadership: Work closely with data engineers, analysts, and business stakeholders to translate business requirements into scalable solutions.
Technology Evaluation: Stay updated with emerging trends, assess new tools and frameworks, and drive innovation in data engineering.
Required Skills
Education: Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
Experience: 7 - 12+ years of experience in data engineering.
Cloud Platforms: Strong expertise in AWS/Azure data services.
Databases: Hands-on experience with SQL, NoSQL, and columnar databases such as PostgreSQL, MongoDB, Cassandra, and Snowflake.
Programming: Proficiency in Python, Scala, or Java for data processing and automation.
ETL Tools: Experience with tools like Apache Airflow, Talend, DBT, or Informatica.
Machine Learning & AI Integration (Preferred): Understanding of how to architect data solutions for AI/ML applications.
Skills: scala,etl,java,mongodb,sql,python,talend,gcp,data engineering,elt,azure,cloud,snowflake,postgresql,aws,apache airflow,cassandra,informatica,dbt
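Since the role above combines Python, Snowflake, and Airflow-style orchestration, here is a minimal sketch of a daily ELT DAG, assuming Apache Airflow 2.x; the DAG id, task callables, and schedule are hypothetical placeholders rather than anything from the posting.

```python
# Minimal sketch of a daily ELT DAG, assuming Apache Airflow 2.x.
# DAG id, callables, and schedule are illustrative only.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Pull the previous day's rows/files from the source system.
    print("extracting for", context["ds"])


def load_raw(**context):
    # Land the extracted data into a raw/staging area (e.g. S3, then COPY INTO Snowflake).
    print("loading raw data")


def transform(**context):
    # Run SQL/dbt-style transformations from raw into curated models.
    print("building curated models")


with DAG(
    dag_id="daily_elt_pipeline",
    start_date=datetime(2025, 1, 1),
    schedule_interval="@daily",
    catchup=False,
    tags=["elt", "sketch"],
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_load = PythonOperator(task_id="load_raw", python_callable=load_raw)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)

    t_extract >> t_load >> t_transform
```

The extract-load-transform ordering and the explicit task dependencies are the part that generalizes; the callables would be replaced by operators for the actual sources and warehouse.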
Posted 2 days ago
25.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
About Us
Toluna is the global research and insights leader that enables businesses to make smarter, data-driven decisions – faster. For 25 years, we have partnered with the world’s leading brands, delivering transformative impact through our advanced technology platform, comprehensive solution portfolio, expansive global first-party panel, and world-class team of leading research experts. Since 2019, we’ve made significant investments in artificial intelligence to enhance automation, accelerate insight delivery, and unlock deeper understanding at scale. With 40+ offices worldwide, Toluna operates in 70+ countries, redefining the future of insights. Learn more at www.tolunacorporate.com
About the role:
We are seeking an experienced Power BI Developer with a strong background in finance to join our team. As a Power BI Developer, you will leverage your expertise in both data analytics and finance to design, develop, and maintain interactive dashboards and reports that provide key insights into the financial performance and trends of the organization. This role reports to Senior Analytics & Modelling and works closely with business stakeholders, finance teams, and other departments to ensure that reporting meets the needs of the organization while delivering actionable financial insights.
Key Responsibilities:
Develop Power BI Reports/Dashboards: Design, develop, and maintain interactive Power BI dashboards and reports, focusing on key financial metrics, budgeting, forecasting, and profitability analysis.
Data Modelling & ETL: Create and manage data models in Power BI, ensuring the accuracy, quality, and integrity of financial data. Integrate and transform data from multiple financial systems and sources into Power BI using Power Query and DAX.
Financial Analysis & Visualization: Work with finance teams to understand financial data needs and provide actionable insights into cash flow, balance sheets, P&L, variance analysis, and other key financial indicators (a small worked example follows this posting).
Collaborate with Stakeholders: Act as the bridge between IT, business, and finance departments to ensure that dashboards and reports are aligned with organizational goals and deliver key financial insights.
Optimization: Continuously optimize reports and dashboards for performance and usability, ensuring end users can interact with reports in an intuitive way.
Troubleshooting and Support: Provide ongoing support and troubleshooting for Power BI reports, resolving issues related to data accuracy, report design, or performance.
Training and Documentation: Provide training to finance and non-finance staff on how to use Power BI reports and dashboards effectively. Create clear documentation for report usage and data sources.
Education Requirements:
Bachelor’s degree in Finance, Accounting, Computer Science, Data Analytics, or a related field.
Microsoft Certified: Data Analyst Associate or a similar Power BI certification is desired.
Experience:
Minimum of 3 years of experience in Power BI development, with a strong focus on finance-related reports and analysis.
Hands-on experience in financial data modelling, budgeting, forecasting, and reporting.
Proficiency in Power BI Desktop, Power BI Service, and DAX (Data Analysis Expressions).
Experience with ETL processes, including integrating data from multiple sources (SQL, Excel, Salesforce, etc.).
Familiarity with financial metrics and KPIs.
Required Technical Skills
Proficient in Power BI, including report creation, DAX, Power Query, and data visualization best practices.
Strong knowledge of SQL for data extraction and manipulation.
Advanced Excel skills, such as complex formulas.
Desirable Technical Skills
Experience with cloud platforms such as Microsoft Fabric.
Familiarity with finance tools and software such as SAP.
Familiarity with other BI tools such as QlikView and Anaplan.
Familiarity with advanced analytics tools (Python, R) or machine learning algorithms related to finance.
Soft Skills
Strong analytical skills with the ability to transform complex financial data into clear, actionable insights.
Excellent communication skills and the ability to present financial information to both technical and non-technical stakeholders.
Strong attention to detail and problem-solving ability.
Ability to work independently and as part of a team in a fast-paced environment.
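The Toluna role above centres on budgeting and variance analysis feeding Power BI models. As a rough illustration of the underlying calculation (which in the report itself would typically be a DAX measure), here is a minimal pandas sketch of actual-vs-budget variance by cost centre; the column names and figures are invented for the example.

```python
# Minimal sketch of actual-vs-budget variance by cost centre, using pandas.
# Column names and figures are invented; in Power BI this would usually be a DAX measure.
import pandas as pd

data = pd.DataFrame(
    {
        "cost_centre": ["Sales", "Sales", "Ops", "Ops"],
        "month": ["2025-06", "2025-07", "2025-06", "2025-07"],
        "actual": [120_000, 135_000, 80_000, 95_000],
        "budget": [110_000, 140_000, 85_000, 90_000],
    }
)

summary = data.groupby("cost_centre", as_index=False)[["actual", "budget"]].sum()
summary["variance"] = summary["actual"] - summary["budget"]
summary["variance_pct"] = (summary["variance"] / summary["budget"] * 100).round(2)

print(summary)
# A roughly equivalent DAX measure would be:
#   Variance % = DIVIDE([Total Actual] - [Total Budget], [Total Budget])
```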
Posted 2 days ago
2.0 years
0 Lacs
India
On-site
Description
Role Description:
Business Analyst position focused on developing analytical solutions
Responsible for managing large datasets and providing business insights
Transforms data into actionable business information
Designs and implements analytics solutions
Works with stakeholders worldwide to drive business impact
Skills Of a Successful Business Analyst
Strong SQL and data analytics capabilities
Excellence in data visualization and dashboard creation
Advanced Excel skills (VBA, pivot tables, power pivots)
Strong stakeholder management abilities
Problem-solving and analytical thinking
Communication and presentation skills
Ability to handle ambiguity
Competencies
Must Haves:
Analytical experience
Proficiency in SQL
Strong data visualization skills
Stakeholder management experience
Excel expertise
Good To Have
Experience with large-scale data sets
Knowledge of metadata modeling
Pipeline monitoring experience
Business domain knowledge
Experience with multiple analytics tools
Why Is This Role Difficult To Hire
Rare combination of technical and business skills required
Need for both analytical and communication capabilities
High demand for experienced analysts in the market
Complex technical requirements (SQL, data visualization, Excel)
Competition from other tech companies (like Flipkart, Uber, Ola)
Why Are Business Analysts Critical At Amazon
Drive data-driven decision making
Provide crucial insights into operations health
Transform complex data into actionable business information
Enable stakeholders worldwide to access and understand data
Support business growth through analytical solutions
Help identify trends, patterns, and improvement opportunities
A Day In The Life
Creating and managing datasets
Building and maintaining dashboards
Conducting deep dive analyses
Meeting with stakeholders to gather requirements
Monitoring data pipelines
Optimizing query performance
Creating visualizations and reports
Presenting insights to business leaders
Making data-driven recommendations
Collaborating with technical teams
Working on metadata modeling
Managing competing priorities in an agile environment
A day in the life
The candidate is expected to have good SQL skills and business acumen. A typical day involves requirement gathering, understanding the business problem, building knowledge of Amazon's databases and interfaces, suggesting the optimal solution to the problem statement, and qualifying and quantifying it using insightful data points.
Basic Qualifications
Experience creating complex SQL queries joining multiple datasets; ETL/DW concepts (a small join-and-pivot sketch follows this posting)
Experience defining requirements and using data and metrics to draw business insights
Knowledge of data visualization tools such as QuickSight, Tableau, Power BI or other BI packages
2+ years of Excel or Tableau (data manipulation, macros, charts and pivot tables) experience
Experience with SQL or ETL
2+ years of tax, finance or a related analytical field experience
2+ years of end-to-end handling of business analysis, from requirement gathering to insightful metrics presentation
Preferred Qualifications
Experience working with Tableau
Experience using very large datasets
Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information.
If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner. Company - ADCI MAA 15 SEZ Job ID: A3032397
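The Amazon Business Analyst role above emphasises joining multiple datasets and pivot-style summaries. Below is a minimal pandas sketch of that pattern (join, aggregate, pivot), with invented table and column names; in practice the equivalent work would usually be SQL against Redshift or a similar warehouse.

```python
# Minimal sketch: join two datasets, then pivot for a stakeholder-facing summary.
# Table/column names and numbers are invented for illustration.
import pandas as pd

orders = pd.DataFrame(
    {
        "order_id": [1, 2, 3, 4],
        "region_id": [10, 10, 20, 20],
        "week": ["2025-W27", "2025-W28", "2025-W27", "2025-W28"],
        "revenue": [2500, 3100, 1800, 2200],
    }
)
regions = pd.DataFrame({"region_id": [10, 20], "region": ["North", "South"]})

# Equivalent to an inner JOIN in SQL.
joined = orders.merge(regions, on="region_id", how="inner")

# Equivalent to GROUP BY region, week with SUM(revenue), reshaped as a pivot table.
pivot = joined.pivot_table(
    index="region", columns="week", values="revenue", aggfunc="sum", fill_value=0
)
print(pivot)
```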
Posted 2 days ago
0 years
0 Lacs
Kochi, Kerala, India
On-site
Role Description
Role Proficiency: Act creatively to develop applications by selecting appropriate technical options, optimizing application development, maintenance and performance by employing design patterns and reusing proven solutions. Account for others' developmental activities, assisting the Project Manager in day-to-day project execution.
Outcomes
Interpret the application feature and component designs to develop them in accordance with specifications.
Code, debug, test, document and communicate product/component/feature development stages.
Validate results with user representatives; integrate and commission the overall solution.
Select and create appropriate technical options for development, such as reusing, improving or reconfiguring existing components, while creating own solutions for new contexts.
Optimise efficiency, cost and quality.
Influence and improve customer satisfaction.
Influence and improve employee engagement within the project teams.
Set FAST goals for self/team; provide feedback on FAST goals of team members.
Measures Of Outcomes
Adherence to engineering process and standards (coding standards)
Adherence to project schedule/timelines
Number of technical issues uncovered during the execution of the project
Number of defects in the code
Number of defects post delivery
Number of non-compliance issues
Percent of voluntary attrition
On-time completion of mandatory compliance trainings
Outputs Expected:
Code
Code as per the design
Define coding standards, templates and checklists
Review code – for team and peers
Documentation
Create/review templates, checklists, guidelines and standards for design/process/development
Create/review deliverable documents, design documentation, requirements, test cases and results
Configure
Define and govern configuration management plan
Ensure compliance from the team
Test
Review/create unit test cases, scenarios and execution
Review test plan created by testing team
Provide clarifications to the testing team
Domain Relevance
Advise software developers on design and development of features and components with a deeper understanding of the business problem being addressed for the client
Learn more about the customer domain and identify opportunities to provide value addition to customers
Complete relevant domain certifications
Manage Project
Support Project Manager with inputs for the projects
Manage delivery of modules
Manage complex user stories
Manage Defects
Perform defect RCA and mitigation
Identify defect trends and take proactive measures to improve quality
Estimate
Create and provide input for effort and size estimation and plan resources for projects
Manage Knowledge
Consume and contribute to project-related documents, SharePoint libraries and client universities
Review the reusable documents created by the team
Release
Execute and monitor release process
Design
Contribute to creation of design (HLD, LLD, SAD)/architecture for applications, features, business components and data models
Interface With Customer
Clarify requirements and provide guidance to Development Team
Present design options to customers
Conduct product demos
Work closely with customer architects for finalizing design
Manage Team
Set FAST goals and provide feedback
Understand aspirations of the team members and provide guidance, opportunities etc.
Ensure team members are upskilled
Ensure team is engaged in project
Proactively identify attrition risks and work with BSE on retention measures
Certifications
Obtain relevant domain and technology certifications
Skill Examples
Explain and communicate the design/development to the customer
Perform and evaluate test results against product specifications
Break down complex problems into logical components
Develop user interfaces and business software components
Use data models
Estimate time and effort of resources required for developing/debugging features/components
Perform and evaluate tests in the customer or target environments
Make quick decisions on technical/project-related challenges
Manage a team, mentor and handle people-related issues in the team
Maintain high motivation levels and positive dynamics within the team
Interface with other teams, designers and other parallel practices
Set goals for self and team; provide feedback for team members
Create and articulate impactful technical presentations
Follow a high level of business etiquette in emails and other business communication
Drive conference calls with customers and answer customer questions
Proactively ask for and offer help
Ability to work under pressure, determine dependencies and risks, facilitate planning and handle multiple tasks
Build confidence with customers by meeting the deliverables on time with a quality product
Knowledge Examples
Appropriate software programs/modules
Functional & technical designing
Programming languages – proficient in multiple skill clusters
DBMS
Operating Systems and software platforms
Software Development Life Cycle
Agile – Scrum or Kanban methods
Integrated development environments (IDE)
Rapid application development (RAD)
Modelling technology and languages
Interface definition languages (IDL)
Broad knowledge of customer domain and deep knowledge of the sub-domain where the problem is solved
Additional Comments
Power BI Developer to design, develop, and maintain robust business intelligence solutions. The ideal candidate will have expertise across the Power BI platform, strong data modeling capabilities, and experience integrating diverse data sources to deliver insightful and performant reports and dashboards.
Key Responsibilities:
Develop, publish, and maintain interactive reports and dashboards using Power BI Desktop and Power BI Service.
Design and implement semantic data models to enable efficient reporting and analytics.
Write advanced DAX, SQL, and Power Query (M) queries for data transformation and analysis.
Integrate and transform data from various sources, including SQL Server, Snowflake, SharePoint, and external APIs (a small API-ingestion sketch follows this posting).
Optimize reports and data models for performance, scalability, and security.
Collaborate with business stakeholders to gather requirements and translate them into effective visualizations and dashboards.
Plan report layouts and create wireframes for intuitive user experiences.
Document BI solutions, data models, and development processes.
Skills
Power BI,SQL,DAX
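The Additional Comments above mention integrating data from external APIs alongside SQL Server, Snowflake, and SharePoint. Below is a minimal sketch of pulling a JSON API into a tabular frame that a BI model could ingest, assuming the requests and pandas libraries; the URL, response shape, and field names are hypothetical.

```python
# Minimal sketch: fetch a JSON API and flatten it into a tabular frame for BI ingestion.
# URL, response shape, and field names are hypothetical.
import pandas as pd
import requests

API_URL = "https://api.example.com/v1/invoices"  # placeholder endpoint

resp = requests.get(API_URL, params={"page_size": 100}, timeout=30)
resp.raise_for_status()
records = resp.json().get("results", [])  # assumes a {"results": [...]} payload

# json_normalize flattens nested fields (e.g. customer.name) into columns.
df = pd.json_normalize(records)
if "amount" in df.columns:
    df["amount"] = pd.to_numeric(df["amount"], errors="coerce")

# From here the frame could be written to a staging table that Power Query reads.
print(df.head())
```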
Posted 2 days ago
0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Introduction
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.
Your Role And Responsibilities
As an Application Developer, you will lead IBM into the future by translating system requirements into the design and development of customized systems in an agile environment. The success of IBM is in your hands as you transform vital business needs into code and drive innovation. Your work will power IBM and its clients globally, collaborating and integrating code into enterprise systems. You will have access to the latest education, tools and technology, and a limitless career path with the world’s technology leader. Come to IBM and make a global impact.
Responsibilities
Manage end-to-end feature development and resolve challenges faced in implementing it
Learn new technologies and apply them in feature development within the time frame provided
Manage debugging, root cause analysis and fixing of the issues reported on the Content Management back-end software system
Preferred Education
Master's Degree
Required Technical And Professional Expertise
Tableau Desktop & Server
SQL, Oracle & Hive
Communication skills, project management, multitasking, collaborative skills
Proven experience in developing and working with Tableau-driven dashboards and analytics
Ability to query and display large data sets while maximizing workbook performance
Ability to interpret technical or dashboard structure and translate complex business requirements into technical requirements
Preferred Technical And Professional Experience
Tableau Desktop & Server
SQL, Oracle & Hive
Posted 2 days ago