Spend money on data you actually need.
Processing data costs time and money. Should you process in 'real-time', or something else?
Understand these terms, or define them yourself. Either way, ensure your team knows what you mean. As data grows, the cost may outstrip the value if the wrong mode of processing is chosen for your organisation.
- Real-time
Signals (such as computer-generated events or other messages) are received and processed within seconds of their original release, e.g. within 5 elapsed seconds from release to processing.
(Processed = human-readable, and/or parsed and understood by the receiving system.)
- Near-time
Signals, as defined above, are received and processed within single-digit minutes, e.g. less than 10 elapsed minutes from release to processing.
- Past-time
Signals are received, but are only usable in retrospect. Regardless of whether they arrived in real-time, their low priority means they are not worth processing until some time later.
Past-time is measured in double digits: tens of minutes, or even hours, later.
Past-time is the signalling equivalent of ‘eventual accuracy’.
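The three tiers above boil down to elapsed time between a signal's release and its processing. A minimal sketch, using the illustrative thresholds from the definitions (5 seconds and 10 minutes are this article's examples, not a standard):

```python
from datetime import datetime, timedelta

# Thresholds taken from the example figures above -- tune them for your
# own organisation's definitions.
REAL_TIME_MAX = timedelta(seconds=5)   # "within 5 elapsed seconds"
NEAR_TIME_MAX = timedelta(minutes=10)  # "less than 10 elapsed minutes"

def classify_signal(released: datetime, processed: datetime) -> str:
    """Return the processing tier for a signal, given when it was
    released and when it was actually processed."""
    elapsed = processed - released
    if elapsed <= REAL_TIME_MAX:
        return "real-time"
    if elapsed <= NEAR_TIME_MAX:
        return "near-time"
    return "past-time"
```

For example, a signal released at noon and processed three seconds later classifies as "real-time"; the same signal processed two hours later is "past-time".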
So what?
The burgeoning volume of (seldom used) data now being sent, collected and received from computer systems, including IoT devices, means that organisations need to know why, for whom and for what purpose they are storing this data. They can then decide what priority to give it on their networks, their storage systems and their processing power.
Organisations must weigh the cost of storing and processing data against its real value.
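One way to make that weighing concrete is to compare an estimated value per gigabyte against the cost of each processing tier. The figures below are made-up assumptions, purely to illustrate the comparison:

```python
# Hypothetical monthly cost to process and store 1 GB at each tier.
# These numbers are illustrative assumptions, not benchmarks.
TIER_COST_PER_GB = {
    "real-time": 50.0,
    "near-time": 5.0,
    "past-time": 0.5,
}

def affordable_tiers(value_per_gb: float) -> list[str]:
    """Return the tiers whose processing cost is covered by the
    estimated value of the data."""
    return [tier for tier, cost in TIER_COST_PER_GB.items()
            if cost <= value_per_gb]
```

Under these assumed numbers, data worth 10 per gigabyte justifies near-time or past-time processing, but not real-time; data worth almost nothing justifies none of them, which is exactly when you should question collecting it at all.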
Think!
Do you really need real-time? Signals are only data; they need to be processed to become information, and only then can value be derived from them.