Necessity is the mother of flow
By Arunabh Satpathy
In a freewheeling talk today, Future in Review founder and CEO Mark Anderson and physicist and computer scientist Larry Smarr laid out a vision for computing that breaks free from current paradigms. According to Smarr, the dominant and static Von Neumann computing architecture is poised to be replaced by “flow computing,” where large data sets are parsed in real time to extract nuggets of useful information.
Smarr gave a few examples of how flow computing is taking over with big data sets. His first example was the Large Synoptic Survey Telescope in Chile, which surveys most of the observable sky and processes the images at 40,000 megabits a second.
“It turns the universe into a video stream,” Smarr said. “That’s a level of flow that is literally turning the universe into a movie.”
One of the other central assertions of the discussion was that flow computing is more akin to how the human brain works. Drawing on the way the human eye reduces incoming photons into usable information, Smarr said that the human brain experiences a “pattern-recognized flow stream every moment of every day.”
Anderson responded by emphasizing the “data triage” aspect of human experience. Data triage is the central mechanism of flow computing: large amounts of streaming information are watched for relevant patterns, and everything else is discarded.
“We are not seeing the world,” Anderson said. “We’re seeing what we need to see.”
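The data triage idea described above can be sketched in a few lines of code. This is a minimal illustration, not anything presented at the talk: a generator scans a flow of readings, keeps only the items matching a pattern of interest, and lets the rest pass by without being stored. The sensor readings and the relevance test are hypothetical.

```python
import random

def data_triage(stream, is_relevant):
    """Yield only the items in the flow that match the pattern
    of interest; everything else is discarded immediately,
    never stored."""
    for item in stream:
        if is_relevant(item):
            yield item

# Hypothetical sensor flow: mostly routine readings around 20.0,
# where only far-out-of-range values are worth keeping.
random.seed(0)
readings = (random.gauss(20.0, 5.0) for _ in range(1000))
anomalies = list(data_triage(readings, lambda r: abs(r - 20.0) > 15.0))
print(f"kept {len(anomalies)} of 1000 readings")
```

Because the generator is lazy, the full stream is never held in memory, which is the point of triage: the flow is watched, not archived.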
Another example Smarr gave of the emerging flow computing paradigm was General Electric’s (GE) “industrial internet,” which may be defined as the integration of complex machinery with sensors to gather data continuously. In GE’s case, Smarr mentioned that the industrial internet is generating 10 petabytes a day. With data at this scale, patterns can be analyzed and faults predicted accurately, because machine behavior can be measured with great fidelity.
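One simple way to see how continuous sensor flows enable fault prediction is a rolling-window anomaly check: flag any reading that deviates sharply from the recent average. This is a hedged sketch of the general idea only, with a made-up vibration trace and an injected fault; it is not GE’s actual analytics.

```python
from collections import deque

def detect_drift(stream, window=50, threshold=3.0):
    """Flag readings that deviate from the rolling mean by more
    than `threshold` standard deviations -- a crude stand-in for
    the predictive analytics a continuous sensor flow enables."""
    recent = deque(maxlen=window)
    flagged = []
    for i, value in enumerate(stream):
        if len(recent) == window:
            mean = sum(recent) / window
            var = sum((x - mean) ** 2 for x in recent) / window
            std = var ** 0.5 or 1e-9
            if abs(value - mean) / std > threshold:
                flagged.append((i, value))
        recent.append(value)
    return flagged

# Hypothetical vibration trace: steady low-level oscillation,
# then one fault-like spike injected at index 150.
trace = [1.0 + 0.01 * (i % 5) for i in range(200)]
trace[150] = 5.0  # injected fault signature
print(detect_drift(trace))  # the spike is the only flagged reading
```

Catching the deviation as it arrives, rather than after a machine has already failed, is what underlies Smarr’s claim that machines “will never break.”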
“Machines will never stop,” Smarr said. “They will never break.”
He further developed the idea by mentioning that fitness trackers like the Fitbit and the Apple Watch, distributed across approximately 100 million people, could lead to a “true healthcare maintenance system” based on predictive analytics.
Finally defining the flow computing paradigm, Smarr called it “a set of specialized architectures for computer chips that are designed for real time flows that never stop.”
Anderson wrapped up the session by turning the current hot topic of the “Internet of Things” on its head.
“If you put a trillion sensors out there, there are a trillion streams of information flowing back,” he said. “The problem isn’t the Internet of Things. The real problem, from a technology standpoint, is the internet of flows.”
To discover more or read other articles from the conference, visit StratNews.com or our Medium blog.