How a Brain-inspired Theory May Solve Problems of Big Data

Vast quantities of “big data” are posing major challenges for today’s computers, but there are some potential answers, inspired by the human brain. They are described in a new article by Dr Gerry Wolff of CognitionResearch.org, published in the journal "IEEE Access".


(PRWEB) June 22, 2014

We think of supercomputers as being super-powerful. But they can be overwhelmed by the floods of information now produced in science, commerce, government, meteorology, social media, and other areas. And compared with the human brain, they gobble up lots of energy and take up lots of space.

But the human brain may provide some answers in the shape of an "SP machine", based on the "SP theory of intelligence", which is itself based on research into the workings of brains and nervous systems. This new thinking about “Big data and the SP theory of intelligence”—developed by Dr Gerry Wolff of CognitionResearch.org—has now been published in the journal "IEEE Access" [Note 1].

“We can save a lot of energy by using probabilities,” said Dr Wolff. “Instead of doing computations mechanically, we can concentrate our efforts where answers are most likely to be found. We don’t need some special process for gathering information about probabilities because, as a by-product of how the SP system works, it creates a statistical model of its environment.”
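The article does not spell out the mechanism, but the idea in the quote above can be illustrated with a generic sketch (not the SP system's actual algorithm, and all names below are hypothetical): frequency counts gathered as a by-product of reading the data serve as a simple statistical model, and search effort is then concentrated on the most probable candidates first.

```python
from collections import Counter

def build_model(corpus):
    # The statistical model arises as a by-product of reading the data:
    # simply count how often each pattern (here, a word) occurs.
    return Counter(corpus.split())

def most_likely_completion(model, prefix):
    # Concentrate effort where answers are most likely to be found:
    # test candidates in order of decreasing frequency and stop at the
    # first match, instead of scanning everything mechanically.
    for word, _count in model.most_common():
        if word.startswith(prefix):
            return word
    return None

model = build_model("the cat sat on the mat the cat ran")
print(most_likely_completion(model, "ca"))  # prints "cat"
```

Because the frequent candidates are tried first, the search usually terminates after examining only a small fraction of the possibilities, which is where the energy saving would come from.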

Big savings may also be possible in transmitting things like TV programmes. With some further development, the SP system may learn general rules and patterns from that kind of information. If both the transmitter of a TV programme and the receiving TV sets know those rules and patterns, then a programme can be transmitted economically by sending only the parts that differ from them.
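The article gives no concrete transmission scheme, so the following is only a minimal sketch of the general idea, assuming both ends hold the same list of previously learned patterns (the hypothetical SHARED list below). A programme is then sent as short references to known patterns plus the novel parts in full:

```python
# Patterns assumed to be known in advance to both transmitter and receiver.
SHARED = ["opening titles", "theme music", "weather map background"]

def encode(programme, shared=SHARED):
    # Send only what differs from the shared rules and patterns:
    # known parts become short references, novel parts go in full.
    return [("ref", shared.index(part)) if part in shared
            else ("new", part)
            for part in programme]

def decode(stream, shared=SHARED):
    # The receiver reconstructs the programme from its own copy of
    # the shared patterns plus the transmitted novelties.
    return [shared[payload] if kind == "ref" else payload
            for kind, payload in stream]

programme = ["opening titles", "today's interview", "theme music"]
assert decode(encode(programme)) == programme
```

The more of the programme that is covered by shared patterns, the less data needs to be sent; only genuinely new material travels in full.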

The SP system may help to bring some order into the chaos of different ways in which knowledge is represented in computer systems. In just one area, the representation of images, there are many different formats (JPEG, TIFF, WMF, BMP, GIF, EPS, PDF, PNG, PBM, and more), each with its own special mode of processing. “This jumble of different formalisms and formats for knowledge is a great complication in the processing of big data, especially in processes for the discovery or learning of structures and associations in big data,” said Dr Wolff. The SP system may help to simplify things by serving as a "universal framework for the representation and processing of diverse kinds of knowledge" (UFK).

The SP system may also help with such things as recognising patterns in big data, reasoning about big data, and presenting structures and processes in visual forms that help people understand big data.

“A useful step forward in developing these ideas would be the creation of a high-parallel version of the SP machine,” said Dr Wolff. “This would be based directly on the existing SP computer model and hosted on an existing high-performance computer, and it would provide a means for researchers everywhere to see what can be done with the system and to create new versions of it.”

NOTES

[1] “Big data and the SP theory of intelligence”, J Gerard Wolff, IEEE Access, volume 2, pages 301-315, 2014, DOI: 10.1109/ACCESS.2014.2315297, http://bit.ly/1jGWXDH. This is an open-access article that may be downloaded without charge. Further information about the SP research may be found via http://www.cognitionresearch.org/sp.htm.

[2] Contact: Dr Gerry Wolff PhD CEng, CognitionResearch.org, jgw(at)cognitionresearch(dot)org, +44 (0) 1248 712962, +44 (0) 7746 290775, Skype: gerry.wolff, http://www.cognitionresearch.org, 18 Penlon, Menai Bridge, Anglesey, LL59 5LR, UK.
