The Securities Technology Analysis Center (STAC) Summit held in New York this fall marked Axellio’s fourth appearance in the event series. Since joining STAC two years ago, we have continued to walk away with valuable insights from collaborative conversations with other vendors and FinTech professionals.
STAC Summit: A FinTech Must
At the 2018 Fall NY conference, topics ranged from FPGA programming and crypto-trading to benchmarking machine learning systems (the data world is rapidly changing – as if I need to tell you!). These topics led to one of those great insight conversations with John Lockwood, CEO of Algo-Logic. The adoption of FPGAs has been accelerating, bringing a powerful new tool to finance, networking, and the Internet of Things. Algo-Logic offers solutions that drastically change the traditional methods of designing systems such as tick-to-trade, futures and options order books, network search and classification, and sensor data acquisition (IoT).
The Need for Speed
We find that “speed” is driving many of the latest adoptions in the FinTech community. However, there is one form of speed that many don’t initially think of: the speed of ingesting and analyzing large datasets in real time. This isn’t HPC, folks – it is not just massive compute speed, but converged storage, networking, AND compute speed.
While processing power still increases with Moore’s law, more and more solutions are proving the value of combining live data streams with historical data in real time. Applying historical context to what’s happening right now requires more than processing speed – it also demands ingest speed and storage-media speed – to give true enrichment and value to the data we work with daily. For example, an upcoming regulation known as FRTB (the Fundamental Review of the Trading Book) will require combining real-time data streams with historical pricing data, with a large effect on how capital ratios are calculated. So, what are the infrastructure implications of this requirement? Systems performing this type of work must not only consume or ingest as much data as they can, but at the same time look back through and analyze as much data as they can store, as fast as possible. The more historical data a system can store and analyze efficiently, the more real-time data can be “enriched” – meeting the FRTB requirement.
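To make the idea of “enrichment” concrete, here is a minimal Python sketch of attaching historical context to a live price stream. The class and field names (`TickEnricher`, `hist_mean`, `hist_stdev`) are hypothetical illustrations, not part of any FRTB specification or Axellio product; a production system would draw on far deeper history than a small in-memory window.

```python
from collections import deque
from statistics import mean, stdev

class TickEnricher:
    """Illustrative sketch: enrich each live tick with rolling historical stats.

    Hypothetical example only; real FRTB workloads combine live streams
    with large historical datasets, not a tiny in-memory window.
    """

    def __init__(self, window=1000):
        # Rolling window of recently seen prices (stands in for a history store).
        self.history = deque(maxlen=window)

    def enrich(self, price):
        # Build historical context from prices seen *before* this tick.
        context = {}
        if len(self.history) >= 2:
            context["hist_mean"] = mean(self.history)
            context["hist_stdev"] = stdev(self.history)
        self.history.append(price)
        # The enriched tick carries the live price plus historical context.
        return {"price": price, **context}

enricher = TickEnricher(window=3)
for p in [100.0, 101.0, 99.5, 100.5]:
    tick = enricher.enrich(p)
```

After the loop, the final tick carries the live price alongside the mean and standard deviation of the prior prices in the window – the same store/ingest/analyze interplay described above, in miniature.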
Where Speed Meets Need
Now, you may be thinking “So what, I can do that today.” But, if today looks and feels something like this:
- A large number of servers, SANs, and network devices sprawling out of control
- Transitioning to modern infrastructure seems expensive and time-consuming
- Management is becoming cumbersome
then there are improvements to be made: simpler management, greater space and power efficiency, and the ability to do more with less to meet budgetary needs. At Axellio we believe all of the above is possible today – what we have coined “Where Speed Meets Need.”
As you approach major renewals and new regulatory projects such as FRTB that demand a different approach to data “speeds,” FabricXpress should be part of that discussion.
We are currently looking for opportunities to collaborate – to make data projects and datacenter life less challenging and more modern, and to help you find a solution where speed meets your need.