Examples of web streaming have become rather predictable and yawn-worthy. It’s always some variation of streaming real-time stock market data, news and status updates from the cloud to your browser, tablet or phone – classic filtered fan-out data distribution. Sure, there are a few upstream bits like the character inputs used for real-time keyword search completion, or chat applications, but the upstream data is a trickle compared to the fire hose coming downstream. However, this model is beginning to flip: applications are more frequently streaming large volumes of data upstream, with only a trickle coming back down.
Consider how most Big Data is being collected at the server side today. Click streams, log data, activity streams, search queries – they are all pouring into Kafka, Scribe, or Flume and ending up in a variety of big data repositories. As users increasingly run thicker smartphone, tablet and desktop apps, the view from the web server becomes less and less complete.
After all, we’re talking about connecting over the public internet, where the best you can hope for is about 20-30 milliseconds of round-trip time just for the network. So who cares if the infrastructure distributing data to the RIAs adds a handful of milliseconds? That’s a small percentage of total latency, right?
RIAs Not Just for the Public Internet
RIAs may have been created with the public internet in mind, but they’re so feature-rich that companies are now using them as a single application/interface for people connecting over all kinds of other networks, including:
- leased lines (between offices)
- Wi-Fi (think iPad on the trading floor)
Suddenly that RIA infrastructure latency that seemed like noise in the context of an internet round trip is significant on a much faster network.
It seems like forever ago that PointCast introduced us to the first incarnation of so-called “push” technology. But aside from stock prices through online brokerages, the real-time streaming of content didn’t really take off until very recently. Even today, many people still refresh their browser to see if they’ve been outbid on eBay and reload the sports scores on their smartphone to see if their team held on for the win.
There are three key factors driving recent, rapid increases in the production and consumption of real-time data:
- Changes in technology – For years, developers have worked around the limitations of browsers and HTML by building thick clients or using COMET to clumsily force data changes into browser applications. The emerging HTML5 standard (with the backing of Google, Apple and Microsoft) will make real-time data a first-class citizen on the internet. At the same time, turnkey infrastructure that integrates back-end systems all the way through to browsers and mobile devices is simplifying development and deployment of these applications.
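The HTML5-era standards referred to above include mechanisms such as WebSocket and Server-Sent Events (SSE), which replace the polling and COMET workarounds. As a hedged illustration of why this simplifies things, here is a minimal sketch of the SSE `text/event-stream` wire format a server emits over one long-lived HTTP response; the `sse_frame` helper and the quote payload are hypothetical examples, not part of any particular product.

```python
# Minimal sketch of the HTML5 Server-Sent Events (SSE) framing.
# A server holds one HTTP response open with
# Content-Type: text/event-stream and writes frames like these;
# each frame ends with a blank line.

def sse_frame(data, event=None, event_id=None):
    """Format one SSE frame; multi-line data becomes multiple data: lines."""
    lines = []
    if event_id is not None:
        lines.append("id: " + event_id)
    if event is not None:
        lines.append("event: " + event)
    lines.extend("data: " + chunk for chunk in (data.splitlines() or [""]))
    return "\n".join(lines) + "\n\n"  # trailing blank line terminates the frame

print(sse_frame("MSFT 31.42", event="quote", event_id="1"), end="")
```

On the browser side, `new EventSource(url)` consumes such a stream and fires one message event per frame, which is what makes server push a first-class browser feature rather than a hack.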
Most internet streaming architectures consist of two very different technologies glued together at the DMZ. Inside the organization you’ll find integration-centric technology (i.e. messaging middleware) that efficiently shuttles data between internal systems. Outside the firewall, presentation-centric technology takes care of getting data in front of users via web or mobile applications.
In between is a gateway that has to understand both environments – corporate messaging semantics on one side, web and mobile application semantics on the other. Each part of this chain—internal message bus, gateway, and web streaming infrastructure—needs to be independently managed, monitored and scaled so none of the components become the weakest link, under normal or peak conditions.
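To make the gateway’s translation role concrete, here is an in-process sketch in plain Python. It is an illustrative assumption, not any real product’s API: the `Gateway` and `FakeClient` classes are hypothetical stand-ins showing how middleware semantics (hierarchical topic, binary payload) get re-framed as web-friendly JSON and fanned out to web clients.

```python
import json

class Gateway:
    """Illustrative bridge between an internal message bus and web clients."""

    def __init__(self):
        self.web_clients = []  # stand-ins for browser/mobile connections

    def register(self, client):
        self.web_clients.append(client)

    def on_bus_message(self, topic, payload):
        # Translate middleware semantics (binary payload, hierarchical
        # topic) into a JSON frame a web or mobile app can consume.
        frame = json.dumps({"topic": topic, "body": payload.decode("utf-8")})
        for client in self.web_clients:  # fan out to every web subscriber
            client.send(frame)

class FakeClient:
    """Records frames in place of a real WebSocket/SSE connection."""

    def __init__(self):
        self.received = []

    def send(self, frame):
        self.received.append(frame)

gw = Gateway()
browser = FakeClient()
gw.register(browser)
gw.on_bus_message("equities/NA/MSFT", b"31.42")
print(browser.received[0])
```

In a real deployment each of these pieces is a separate, independently scaled tier, which is exactly the operational burden described above.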
With enough determination this approach can work, but it’s kind of like building your own car—it’s expensive, time-consuming, and you better keep a toolbox in the trunk, because it will break down and you’re the only one who’ll be able to troubleshoot and fix it.
On Wednesday, Lab49’s CTO Matt Davey presented a case study entitled “Optimizing High-Performance Trading Solutions: An Engineering Perspective” to a standing-room-only audience of over 100 people at the Waters North American Trading Architecture Summit. Lab49 is a leading technology consulting firm with a track record of innovation in delivering cutting-edge solutions to many of the big investment banks, hedge funds and other capital markets firms.
During the session, Matt demonstrated an innovative ETF arbitrage system and a new Single Dealer Platform (SDP) solution that Lab49 has developed using Solace’s recently announced web messaging infrastructure. In each case, the combination of RIA applications, integration with data sources, and high-volume, low-latency distribution between the two are the keys to the system’s architecture.
Matt’s presentation was one of the best attended at the event, and underscored the rabid interest in capital markets in distributing real-time information to RIAs.
We are very pleased to announce a new capability in the Solace product lineup. For many years now, we have dedicated our company to reshaping how messaging enables distributed computing for enterprise applications. Today, we announced that we’re taking our technology beyond the boundaries of the corporate network and into the wild world of the internet.
We’re not the first kids on the block to take on streaming real-time data to rich internet applications (RIAs), but we believe we’ve done it best – and not by a little bit, but by a country mile. The problem of filtering and fanning out data to lots and lots of users is a tailor-made fit for the technology we’ve built over the past 8 years. We’ve already solved this problem for intranets and WANs, and by adding internet protocols, plus the security and management the internet demands, we bring a level of performance, simplicity and reliability never before seen in this market.
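The filtered fan-out problem at the heart of this can be sketched in a few lines. The slash-delimited topics and the single-level `*` wildcard below are illustrative assumptions for the sketch, not Solace’s actual subscription grammar, and the subscriber names are hypothetical.

```python
# Sketch of topic-filtered fan-out: each subscriber registers a filter,
# and each published message is delivered only where the filter matches.

def matches(subscription, topic):
    """True if a slash-delimited filter matches a topic.
    '*' matches exactly one topic level (illustrative syntax)."""
    sub_levels = subscription.split("/")
    top_levels = topic.split("/")
    if len(sub_levels) != len(top_levels):
        return False
    return all(s == "*" or s == t for s, t in zip(sub_levels, top_levels))

def fan_out(subscribers, topic, message):
    """Return the names of subscribers that would receive this message."""
    return [name for name, sub in subscribers.items() if matches(sub, topic)]

subs = {
    "alice": "quotes/*/MSFT",   # MSFT quotes from any exchange
    "bob":   "quotes/NYSE/*",   # everything traded on NYSE
    "carol": "news/*/*",        # all news
}
print(fan_out(subs, "quotes/NASDAQ/MSFT", "31.42"))
```

Doing this matching efficiently for millions of messages per second across large subscriber populations is the hard part the post is alluding to; the logic itself is simple.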