3.0 - 7.0 years
0 Lacs
Maharashtra
On-site
Build your career in the Data, Analytics and Reporting Team, working within the world's most innovative bank that values creativity and excellence. As a Quant Analytics Analyst within the Data Analytics and Reporting Team (DART), you will be responsible for delivering Management Information System (MIS) solutions and supporting daily operations. Your key responsibilities will include supporting day-to-day operations and tasks related to a functional area or business partner, ensuring projects are completed according to established timelines, assembling data, building reports and dashboards, and identifying risks and opportunities along with potential solutions to unlock value.
To excel in this role, you should have professional experience combining business knowledge with relevant MIS/technology/reporting experience. You should have a solid understanding of business operations and procedures and the ability to connect them with business fundamentals. You must also have hands-on experience querying different databases and other source systems for the data analysis required for reporting. Proficiency in creating reports and business intelligence solutions using tools such as Tableau, Cognos, Python, Alteryx, and SAS is essential.
A general desire and aptitude to learn and adapt to new technologies, openness to different perspectives, and the ability to anticipate and resolve customer and general issues with a sense of urgency are crucial for this role. Ideally, you should have prior experience in reporting and data analysis development with the ability to meet stringent deadlines. Proficiency in writing and understanding SQL (PL/SQL, T-SQL, PostgreSQL, or similar) and hands-on data analysis experience are also required. Preferred qualifications include a Bachelor's degree or equivalent.
Prior experience with call center technology data (Avaya CMS, IVR, Aspect, eWFM), Fraud Operations, CTO Operations, and other Consumer and Community Banking departments is desired. Experience in creating and deploying reports with a BI tool (such as Tableau, MicroStrategy, Cognos, SSRS), sourcing and compiling data from a tool with ETL capabilities (such as SSIS, Alteryx, Trifacta, Ab Initio, R, SAS), knowledge of R/Python, Anaconda, and HiveQL, and exposure to cloud databases will be advantageous for this role.
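The posting above asks for hands-on SQL querying and report building against operational data. A minimal sketch of that kind of reporting query, using Python's built-in sqlite3 with a hypothetical call-volume table (the table and column names are illustrative, not from the posting):

```python
import sqlite3

# Hypothetical call-center volume table for a daily MIS report.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE calls (agent TEXT, handled INTEGER, abandoned INTEGER)")
conn.executemany(
    "INSERT INTO calls VALUES (?, ?, ?)",
    [("A", 120, 5), ("B", 95, 12), ("A", 110, 3)],
)

# Aggregate per agent: total handled calls and abandonment rate,
# the shape of query a reporting/BI layer would consume.
rows = conn.execute(
    """
    SELECT agent,
           SUM(handled) AS total_handled,
           ROUND(1.0 * SUM(abandoned) / (SUM(handled) + SUM(abandoned)), 3) AS abandon_rate
    FROM calls
    GROUP BY agent
    ORDER BY total_handled DESC
    """
).fetchall()
print(rows)
```

The same aggregation pattern carries over to PL/SQL, T-SQL, or PostgreSQL; only the dialect details change.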
Posted 3 days ago
4.0 - 8.0 years
8 - 13 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
Role: Technology Lead
No. of years experience: 5+
Detailed job description - Skill Set:
Role Summary: As part of the offshore development team, the AWS Developers will be responsible for implementing ingestion and transformation pipelines using PySpark, orchestrating jobs via MWAA, and converting legacy Cloudera jobs to AWS-native services.
Key Responsibilities:
- Write ingestion scripts (batch & stream) to migrate data from on-prem to S3.
- Translate existing HiveQL into SparkSQL/PySpark jobs.
- Configure MWAA DAGs to orchestrate job dependencies.
- Build Iceberg tables with appropriate partitioning and metadata handling.
- Validate job outputs and write unit tests.
Required Skills:
- 3-5 years in data engineering, with strong exposure to AWS.
- Experience in EMR (Spark), S3, PySpark, SQL.
- Working knowledge of Cloudera/HDFS and legacy Hadoop pipelines.
- Prior experience with data lake/lakehouse implementations is a plus.
Mandatory Skills: AWS Developer
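One of the responsibilities listed is validating migrated job outputs and writing unit tests. A minimal, framework-free sketch of that validation step, comparing a legacy job's output against the new job's output on row count and key coverage (the function and field names are illustrative assumptions; in practice the rows would come from HiveQL and SparkSQL query results):

```python
def validate_migration(legacy_rows, migrated_rows, key_fields):
    """Return a list of discrepancies between legacy and migrated job outputs."""
    issues = []
    # Check 1: the migrated job should produce the same number of rows.
    if len(legacy_rows) != len(migrated_rows):
        issues.append(f"row count mismatch: {len(legacy_rows)} vs {len(migrated_rows)}")
    # Check 2: every business key present before migration should still exist.
    legacy_keys = {tuple(r[k] for k in key_fields) for r in legacy_rows}
    migrated_keys = {tuple(r[k] for k in key_fields) for r in migrated_rows}
    missing = legacy_keys - migrated_keys
    if missing:
        issues.append(f"{len(missing)} key(s) missing after migration")
    return issues

# Hypothetical sample: one record was dropped during migration.
legacy = [{"id": 1, "amt": 10}, {"id": 2, "amt": 20}]
migrated = [{"id": 1, "amt": 10}]
print(validate_migration(legacy, migrated, ["id"]))
# prints ['row count mismatch: 2 vs 1', '1 key(s) missing after migration']
```

Checks like these would typically be wrapped in pytest test cases and run as part of the cutover sign-off for each converted Cloudera job.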
Posted 1 month ago
8.0 - 12.0 years
12 - 18 Lacs
Bengaluru
Work from Office
As a Data Architect, you are required to:
- Design & develop technical solutions which combine disparate information to create meaningful insights for business, using Big-data architectures
- Build and analyze large, structured and unstructured databases based on scalable cloud infrastructures
- Develop prototypes and proof of concepts using multiple data-sources and big-data technologies
- Process, manage, extract and cleanse data to apply Data Analytics in a meaningful way
- Design and develop scalable end-to-end data pipelines for batch and stream processing
- Regularly scan the Data Analytics landscape to stay up to date with the latest technologies, techniques, tools and methods in this field
- Stay curious and enthusiastic about using related technologies to solve problems, and enthuse others to see the benefit in the business domain
Qualification: Bachelor's or Master's in Computer Science & Engineering, or equivalent. A professional degree in Data Engineering/Analytics is desirable.
Experience level: Minimum 8 years in software development, with at least 2-3 years of hands-on experience in Big-data/Data Engineering.
Desired Knowledge & Experience:
Data Engineer - Big Data Developer
- Spark: Spark 3.x, RDD/DataFrames/SQL, Batch/Structured Streaming
- Spark internals: Catalyst/Tungsten/Photon
- Databricks: Workflows, SQL Warehouses/Endpoints, DLT, Pipelines, Unity, Autoloader
- IDE: IntelliJ/PyCharm, Git, Azure DevOps, GitHub Copilot
- Test: pytest, Great Expectations
- CI/CD: YAML Azure Pipelines, Continuous Delivery, Acceptance Testing
- Big Data Design: Lakehouse/Medallion Architecture, Parquet/Delta, Partitioning, Distribution, Data Skew, Compaction
- Languages: Python/Functional Programming (FP)
- SQL: T-SQL/Spark SQL/HiveQL
- Storage: Data Lake and Big Data Storage Design
Additionally, it is helpful to know the basics of:
- Data Pipelines: ADF/Synapse Pipelines/Oozie/Airflow
- Languages: Scala, Java
- NoSQL: Cosmos, Mongo, Cassandra
- Cubes: SSAS (ROLAP, HOLAP, MOLAP), AAS, Tabular Model
- SQL Server: T-SQL, Stored Procedures
- Hadoop: HDInsight/MapReduce/HDFS/YARN/Oozie/Hive/HBase/Ambari/Ranger/Atlas/Kafka
- Data Catalog: Azure Purview, Apache Atlas, Informatica
Big Data Architect
- Expert in the technologies, languages and methodologies mentioned under Data Engineer - Big Data Developer
- Mentor: mentors/educates developers in the technologies, languages and methodologies mentioned under Data Engineer - Big Data Developer
- Architecture Styles: Lakehouse, Lambda, Kappa, Delta, Data Lake, Data Mesh, Data Fabric, Data Warehouses (e.g. Data Vault)
- Application Architecture: Microservices, NoSQL, Kubernetes, Cloud-native
- Experience: Many years of experience with all kinds of technology across the evolution of data platforms (Data Warehouse -> Hadoop -> Big Data -> Cloud -> Data Mesh)
- Certification: Architect certification (e.g. Siemens Certified Software Architect or iSAQB CPSA)
Required Soft Skills & Other Capabilities:
- Excellent communication skills, in order to explain your work to people who don't understand the mechanics behind data analysis
- Great attention to detail and the ability to solve complex business problems
- Drive and the resilience to try new ideas if the first ones don't work
- Good planning and organizational skills
- Collaborative approach to sharing ideas and finding solutions
- Ability to work independently and also in a global team environment
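Data skew, one of the Big Data design concerns listed above, is when a few partitions hold far more data than the rest and dominate a stage's runtime. A minimal sketch of detecting it from per-partition sizes (the 2x-mean threshold is an illustrative assumption, not a standard; in Spark the sizes would come from stage metrics or a count per partition):

```python
def skewed_partitions(sizes, factor=2.0):
    """Return indices of partitions whose size exceeds factor * mean partition size."""
    mean = sum(sizes) / len(sizes)
    return [i for i, size in enumerate(sizes) if size > factor * mean]

# Row counts per partition from a hypothetical Spark stage:
# partition 4 holds most of the data and will straggle.
sizes = [100, 120, 90, 110, 900]
print(skewed_partitions(sizes))
# prints [4]
```

Once a skewed key is identified, typical remedies include salting the join key, repartitioning, or enabling Spark's adaptive query execution.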
Posted 1 month ago