On-site
Full Time
Role: Solution Architecture – Consultant II

The Opportunity

FICO is seeking highly skilled and motivated Data Integration and Data Architecture Engineers to join our Software division in Pittsburgh, PA. In this role, you will be instrumental in designing, developing, and implementing robust and scalable data integration pipelines and data architectures to support our next-generation software products. You will work with a variety of data sources, technologies, and platforms, ensuring efficient and reliable data flow for analytics, reporting, and operational purposes. A key aspect of this role will be learning the FICO Platform and assisting in developing and delivering client demonstrations that showcase its data integration capabilities.

Responsibilities:

- Design and implement end-to-end data integration solutions, encompassing both real-time and batch data ingestion processes.
- Develop and maintain data pipelines using various technologies and methodologies, including but not limited to streaming frameworks (e.g., Kafka, Spark Streaming), RESTful APIs, and GraphQL.
- Architect and model data solutions across a diverse landscape of relational (e.g., Teradata, Oracle, PostgreSQL, MySQL) and NoSQL (e.g., MongoDB, Cassandra, Redis, and graph databases such as Neo4j) data stores.
- Collaborate with cross-functional teams, including product managers, data scientists, and other engineers, to understand data requirements and deliver effective data solutions.
- Ensure data quality, integrity, and security throughout the data integration and storage processes.
- Optimize data architectures for performance, scalability, and maintainability.
- Troubleshoot and resolve data-related issues, ensuring timely and effective solutions.
- Stay current with the latest trends and technologies in data integration, data architecture, and data management.
- Contribute to the development of data standards, best practices, and documentation.
- Participate in code reviews and contribute to the continuous improvement of our engineering processes.
- Evaluate and recommend new data integration and data storage technologies.
- Learn the FICO Platform architecture and its data integration capabilities.
- Assist in the development and delivery of technical demonstrations to clients, showcasing how various data sources can be integrated with the FICO Platform.
- Support the creation of documentation and technical content for client integration scenarios.

Qualifications:

- Bachelor's or Master's degree in Computer Science, Data Science, Engineering, or a related field.
- Proven experience (typically 3+ years for Engineer, 5+ years for Senior Engineer) in data integration and data architecture roles.
- Strong understanding of data integration patterns, ETL/ELT processes, and data warehousing concepts.
- Hands-on experience with real-time data ingestion using streaming technologies (e.g., Kafka, Spark Streaming, Flink).
- Experience developing and consuming RESTful APIs and/or GraphQL APIs for data exchange.
- Proficiency with various relational database management systems (RDBMS) such as Teradata, Oracle, SQL Server, PostgreSQL, and MySQL.
- Experience with NoSQL databases, including document stores (e.g., MongoDB), key-value stores (e.g., Redis), wide-column stores (e.g., Cassandra), and graph databases (e.g., Neo4j).
- Familiarity with cloud-based data platforms and technologies (e.g., Snowflake, Databricks, and AWS, Azure, and GCP data services).
- Strong SQL skills and experience in data manipulation and querying.
- Programming proficiency in one or more scripting or programming languages (e.g., Python, Java, Scala).
- Excellent problem-solving, analytical, and communication skills.
- Ability to work independently and collaboratively in a fast-paced environment.
- Aptitude and willingness to learn new platforms and technologies, specifically the FICO Platform.
- Strong presentation and interpersonal skills to effectively communicate technical concepts during client demonstrations.

Preferred Qualifications:

- Experience with data virtualization technologies.
- Knowledge of data governance and data quality frameworks.
- Experience with containerization technologies (e.g., Docker, Kubernetes).
- Familiarity with data science workflows and tools.
- Contributions to open-source data-related projects.
- Previous experience with enterprise software platforms.