At the 2019 South African Monitoring & Evaluation Association (SAMEA) conference, I had the opportunity to share my thoughts on “Designing the next generation of MERL Tech software”.
Like many sectors, traditional Monitoring & Evaluation is being disrupted by technology. Interesting applications keep emerging for Blockchain, Artificial Intelligence, Machine Learning, Sensors, Drones and many more.
During the 2018 MERL Tech conference held in Johannesburg, our team ran a Design Thinking workshop that sought to identify potential gaps in the MERL Tech sector, one of which was labelled the “Interoperability gap”. (See here for a full list of the gaps and a report we published on the topic.) It is the Interoperability gap that I chose to focus on in my SAMEA presentation.
Firstly, some non-technical members of the MERL community may be unfamiliar with the term Interoperability, but it is one of the most important and widely used concepts in the modern world. In essence, interoperability allows two systems to interact with one another to exchange or make use of information, without either system knowing anything about the internal workings of the other. In software, this interaction is generally facilitated by each system exposing certain functionality via a standardised interface (called an Application Programming Interface, or API) and using common data standards. The two systems then communicate via their respective APIs.
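To make this concrete, here is a minimal sketch of the idea in Python. The field names and schema are purely illustrative (not an actual MERL Tech standard): the point is that once both systems agree on a common data format, the receiving system can consume the sending system’s export without knowing anything about its internals.

```python
import json

# Hypothetical shared data standard: both systems agree that a survey
# submission is a JSON object with exactly these top-level fields.
REQUIRED_FIELDS = {"form_id", "submitted_at", "responses"}

def export_submission(form_id, submitted_at, responses):
    """System A serialises a submission to the agreed JSON format."""
    return json.dumps({
        "form_id": form_id,
        "submitted_at": submitted_at,
        "responses": responses,
    })

def import_submission(payload):
    """System B parses the payload, knowing only the shared standard,
    nothing about System A's internal data model."""
    record = json.loads(payload)
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        raise ValueError(f"payload missing fields: {sorted(missing)}")
    return record

payload = export_submission("hh_survey_v2", "2019-10-01T09:30:00Z",
                            {"q1_household_size": 4})
record = import_submission(payload)
print(record["responses"]["q1_household_size"])  # → 4
```

In practice the payload would travel over each system’s API rather than a function call, but the principle is the same: the standard, not the internal design, is the contract.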
In the context of MERL, the flow of information is vital. It may be useful to think of interoperability as a valve in the information pipeline. If the systems/tools are interoperable, the valve remains open and the information can flow. If the systems/tools are not interoperable, the valve closes and the information gets stuck. You might have experienced this before where it is difficult for you to get data out of or into a software tool.
In the early days of MERL Tech, there were very few mature standards. When we started out developing mobile data collection software more than a decade ago, we had to work within extreme resource constraints. For instance, the Facebook mobile app today is 800x larger than the memory available to us on the early feature phones. At that time, even if standards did exist, the overhead involved in implementing them was prohibitive.
The upside to this nascent period was that early developers like ourselves could innovate and move quickly without having to be concerned with standards and interoperability. But as the sector has evolved and matured it has become more important to focus on interoperability than exclusively on siloed innovation.
Side note: It’s worth mentioning that “open source” and “interoperable” are not the same thing. You can have an open source system that does not follow any standards and is close to impossible to integrate with; and you can have a proprietary system that is highly interoperable.
From the very first version of our tools, we have offered an API that allows data collected using our tools to be pushed easily into other systems. However, we used our own format to store the underlying structure of forms (for the reason mentioned earlier: being able to innovate quickly). To follow through on our view that interoperability has become critical to the overall success of the MERL Tech sector, in 2018 we took the big step of completely rebuilding our tools around the standards we felt had emerged as the most appropriate. An immediate benefit is that users can port forms they had already developed in other tools (such as ODK) to ours, and vice versa.
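As an illustration of what porting a form definition might involve, the sketch below converts a hypothetical internal form format into rows following the XLSForm convention (the de facto form standard used by ODK-style tools, with one “survey” sheet row per question). The internal format and the type mapping here are invented for the example; a real converter would of course handle many more question types, constraints, and translations.

```python
# Hypothetical internal representation of a two-question form.
internal_form = [
    {"name": "resp_name", "kind": "text",   "prompt": "Respondent name"},
    {"name": "hh_size",   "kind": "number", "prompt": "Household size"},
]

# Illustrative mapping from internal question kinds to XLSForm types.
TYPE_MAP = {"text": "text", "number": "integer"}

def to_xlsform_rows(form):
    """Convert internal question records into XLSForm survey-sheet rows."""
    return [
        {"type": TYPE_MAP[q["kind"]], "name": q["name"], "label": q["prompt"]}
        for q in form
    ]

rows = to_xlsform_rows(internal_form)
```

Because both ends of the conversion are well-defined (the internal model and the published standard), the same mapping can be run in reverse, which is what makes round-tripping forms between tools possible.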
New features and technologies will continue to be released to the benefit of the MERL sector, but I believe that the MERL Tech sector as a whole will truly flourish when organisations are free to use the tools that best fit their current needs, rather than those they are locked into using.
If this is the case, then perhaps interoperability is going to be the next killer feature in the MERL Tech sector.