Every ICT system will at some point reach the end of its lifecycle. The challenge of maintaining meaningful (semantic) access to data in legacy systems is becoming more and more of a burden on ICT departments. For various reasons, including the risk of cybersecurity incidents and attacks on outdated systems, it is infeasible to keep applications and servers operational beyond their lifespan, even if it would be technically possible. Business processes evolve continuously, and customer expectations and requirements change even faster. Process management and workflow support systems have a tendency to grow organically into fearsomely complex monolithic beasts, piling up technical debt in the process.
Data migrations come at high cost: changing systems almost always requires data to be converted into the new system's particular data model, which makes data migration verification both complex and costly.
Using PostgreSQL, the open source/free software SQL database engine, one of the leading Relational Database Management Systems and in recent years the most popular database engine among startups and scale-ups, removes the concern about future accessibility and integrity of the data itself.
A middleware layer/REST API built with the Symfony framework and a ReactJS-based frontend ensure a fully responsive, modern UX. Furthermore, we use GitLab and Ansible for project management and for automating our CI/CD environment.
At OpenNovations we’ve created a solution called Aranei, built with standard open source tools, which generates a state-of-the-art user interface based on open web standards on top of any existing SQL data structure. This ensures full access to the data in its original context, using the existing metadata about the data model to define UI elements such as data tables, dashboards, reports, etc. Keeping the same schema/structure between the source system and the archiving solution, or using the notion of a controlled copy, is a much less cumbersome approach, as a one-on-one data verification can be performed.
Not only are the UI components generated by the metadata app modeling engine; automated test scripts and a data dictionary are also generated as part of the data verification process.
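To illustrate the idea (this is a minimal sketch, not Aranei's actual engine or API): a metadata-driven generator can derive a data dictionary directly from PostgreSQL's standard `information_schema.columns` view. The query string and the `build_data_dictionary` helper below are hypothetical names, assuming rows come back as `(table, column, type, nullable)` tuples.

```python
# Hypothetical sketch of deriving a data dictionary from schema metadata.
# The query targets PostgreSQL's standard information_schema catalog.
COLUMNS_QUERY = """
SELECT table_name, column_name, data_type, is_nullable
FROM information_schema.columns
WHERE table_schema = 'public'
ORDER BY table_name, ordinal_position
"""

def build_data_dictionary(rows):
    """Group (table, column, type, nullable) rows into a per-table dictionary."""
    dictionary = {}
    for table, column, dtype, nullable in rows:
        dictionary.setdefault(table, []).append({
            "column": column,
            "type": dtype,
            "required": nullable == "NO",
        })
    return dictionary

# Example rows, shaped like a result set from COLUMNS_QUERY:
rows = [
    ("patient", "id", "integer", "NO"),
    ("patient", "name", "text", "YES"),
]
print(build_data_dictionary(rows))
```

Because the dictionary is generated from the live catalog rather than written by hand, it stays consistent with the archived schema by construction.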
The validation strategy for such an approach is much more straightforward than for any form of data model conversion that changes the data model itself. Keeping data in its original structure and context allows most of the validation and verification steps to be automated and generated instead of having to be created manually.
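One way such automated one-on-one verification could work (a sketch under our own assumptions, not the procedure the abstract describes in detail): compare per-table row counts plus an order-independent checksum over all rows of source and archive. The function names are illustrative.

```python
import hashlib

def table_checksum(rows):
    """Order-independent checksum over a table's rows: hash each row,
    then XOR the digests, so row order does not affect the result.
    (Identical duplicate rows cancel under XOR; the row-count check
    below partially compensates for that in this sketch.)"""
    acc = 0
    for row in rows:
        digest = hashlib.sha256(repr(row).encode()).digest()
        acc ^= int.from_bytes(digest, "big")
    return acc

def verify(source_rows, archive_rows):
    """True when both tables have the same row count and checksum."""
    return (len(source_rows) == len(archive_rows)
            and table_checksum(source_rows) == table_checksum(archive_rows))
```

Since source and archive share the same schema, the comparison needs no mapping logic: the same generated check can run against every table on both sides.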
Even when newly developed web standards cause backward incompatibilities in the UI layer of this system, an updated UI can be created in a reproducible and reliable way by re-generating the UI components.
RAD, cybersecurity, medical-grade regulatory compliance and open source go hand in hand!
May 20, 2022 from 12:15 to 12:45
Speaker: Hans de Raad