Solution and Data Architect
Role purpose:
This role combines Solution Architect and Data Architect responsibilities, requiring a versatile professional who can bridge technical and business needs and ensure the organization’s systems and data work in harmony to support strategic goals. The successful candidate will create and maintain several critical artifacts that serve as the Tribe’s blueprints for both technical solutions and data strategies.
Key accountabilities and decision ownership:
• Design the overall structure of software applications and systems, ensuring that they work together seamlessly
• Work closely with Enterprise Architecture CoE and business stakeholders to understand the business needs and translate them into technical specifications and architectural blueprints
• Create the strategy for how data is handled across the organization, defining policies and standards for data governance, quality, and security
• Design the logical and physical data models, ensuring that data is structured and stored efficiently across systems
• Design and optimize enterprise data warehouses (EDW), data lakes, and other data storage solutions, ensuring data is accessible for analytics, reporting, and business intelligence
• Oversee the ongoing improvement of existing systems, looking for ways to accelerate delivery (time to market), reduce costs, and improve customer experience
Core competencies, knowledge and experience:
• Capable of diagnosing system and data architecture issues, proposing improvements, and optimizing designs to improve performance and reliability
• Experience with architecture design tools (e.g., Draw.io, Lucidchart)
• Experience with development and integration tools (e.g., Postman, Swagger/OpenAPI, MuleSoft)
• Experience with ETL/ELT, as well as designing data pipelines that process and move data from source to destination systems
• Proficiency in Python, SQL, PL/SQL, or Java, and with NoSQL technologies, for data processing and automation
• Knowledge of Big Data platforms and technologies (e.g., Apache Hadoop and Databricks)
• Experience guiding technical teams and providing mentorship to engineers and leadership
Must have technical / professional qualifications:
• A 3-year degree or diploma in IT, IS, or a related field is essential
• Familiarity with AI/ML models, data science workflows, and analytics platforms
• Any cloud certification (e.g., AWS, OCI, GCP, Azure) will be valuable
• Certification in TOGAF
Key performance indicators:
KPIs focus on the effectiveness of designs, the quality of solutions, and the efficiency of processes:
• Configurability and reusability: measured by the time taken from project initiation to solution deployment.
• CIM-compliance standards: consistent data nomenclature and horizontal data lineage for all data sets, including transformations from producer to consumer systems across the business.
• Error rate and self-recoverability: errors or system failures that disrupt the customer experience.