Asking too much of visualization technologies?
-Kumar Metlapalli, Founder & CEO, Kuberre Systems
November 20, 2020
Visualization technologies are hot right now, and for good reason. The ability to transform complex data analysis into something that is relatable to decision makers is invaluable. Leaders across commercial and government organizations want to make data-driven decisions, which has created demand for the many leading visualization platforms we see every day. However, when we tip the scales too far toward using these platforms to produce the analytics behind the visualizations, we may be asking too much of them.
I’ve spent my career working with large data sets that are used both to fuel the many systems asset managers rely upon to run their operations and to interrogate data with the goal of creating new trading strategies. The common thread in all of this work is striking a balance between our desire to learn and the taxation we place upon infrastructure. This taxation comes in the form of the cost of vendor-provided technologies, the load we place upon critical production systems during the business day, and the human capital required to drive the process. Over the past four to five years we have been hyper-focused on meeting this growing analytical hunger with abstraction techniques that protect critical systems and infrastructure while also delivering massive improvements in performance. When all the pieces are in place, the results are simply amazing.
So, what is the punchline of this article? The punchline is that without a data virtualization approach that abstracts the sources of data from the visualization effort, your firm may be placing unrewarded risk on critical infrastructure. Even if the systems risk is deemed minor, the latencies involved when large data sets reside in many disparate locations are limiting. Even where mature enterprise data management is in place to support the many systems and needs of the organization, the analytics side of the house may still need special infrastructure to interact with that data in a performant manner. Whether it is sourced from multiple systems and databases or from a sophisticated data lake, it is critical that the data be available for analysis in a state that allows it to be put to work immediately. This is where we find true “advanced virtualization” to be the natural partner to “visualization”.
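To make the idea concrete, here is a minimal sketch of that abstraction in Python. This is not our implementation, and the source names and schema are hypothetical; it simply shows the seam a virtualization layer creates, where analytics ask for a logical source and never touch the physical production systems directly:

```python
import sqlite3
import pandas as pd

def make_demo_source():
    # Hypothetical stand-in for a production database: an in-memory
    # store seeded with a few rows so the sketch runs end to end.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE holdings (ticker TEXT, qty INTEGER)")
    conn.executemany("INSERT INTO holdings VALUES (?, ?)",
                     [("ABC", 100), ("XYZ", 250)])
    return conn

class VirtualizedFabric:
    """Maps logical source names to connection factories and serves
    analytical queries without exposing the systems behind them."""

    def __init__(self):
        self._sources = {}

    def register(self, name, connect):
        # 'connect' is a factory; in a real fabric it would point at a
        # replicated or cached copy, never the live production system.
        self._sources[name] = connect

    def query(self, name, sql):
        # Each query runs on a short-lived connection that is always closed.
        conn = self._sources[name]()
        try:
            return pd.read_sql_query(sql, conn)
        finally:
            conn.close()

fabric = VirtualizedFabric()
fabric.register("positions", make_demo_source)
print(fabric.query("positions", "SELECT ticker, qty FROM holdings"))
```

The value is not in the code itself but in the seam it creates: the analyst or visualization tool addresses a logical source, and the fabric decides where and how that data is physically served.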
How would you know if the problems you are solving might be better served by pairing visualization with virtualization and a data analytics fabric? Are you nimble enough to provision new data sets easily in support of your quant team’s mission to discover alpha? Are you stitching certain results together manually? Are you creating derived data that then needs to be used for additional analysis? Are you building lots of “joins” that drag performance down and drive complexity up? These are all symptoms of a data operation that would likely benefit from the strength of a virtualization-based analytics approach that can feed the visualization engine of your choice.
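As a small illustration of the “joins” symptom, consider this hedged Python sketch (the frames and column names are hypothetical): the naive path re-stitches the same sources on every refresh, while the virtualized posture derives the joined view once so downstream analysis, and the visualization layer, read a ready-to-use result:

```python
import pandas as pd

# Hypothetical frames standing in for data pulled from separate systems.
holdings = pd.DataFrame({"ticker": ["ABC", "XYZ"], "qty": [100, 250]})
prices = pd.DataFrame({"ticker": ["ABC", "XYZ"], "price": [10.0, 4.0]})

# The symptom: every analysis (and every dashboard refresh) re-runs the
# same stitch-and-derive step against the underlying systems.
def refresh():
    df = holdings.merge(prices, on="ticker")
    df["value"] = df["qty"] * df["price"]
    return df

# The virtualized posture: derive once into a reusable view so repeated
# analyses read a prepared result instead of re-joining the sources.
portfolio_view = refresh()
print(portfolio_view)
```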
If you have questions or wish to discuss this topic in more depth, I welcome you to reach out to us through this page or by emailing me at info@kuberresystems.com. Thank you, as always, for dedicating some of your precious time to reading our perspectives on data-oriented subjects.