Science Philanthropist Jeffrey Epstein Backs the First Free-Thinking Robots


Robots are quickly evolving from direction-driven machines into free-thinking, non-deterministic humanoids, thanks to an ingenious group of artificial intelligence scientists from Hong Kong and Texas, and to the funding of Jeffrey Epstein, a prominent New York science investor.

For a long time, robots looked largely like clunky machines that relied on deterministic algorithmic pathways. Over the last decade, however, Hanson Robotics, based in Richardson, Texas, has revolutionized the appearance of robots with responsive facial expressions, a synthesized rubber skin called Frubber, and delicate features. Collaborator Mark Tilden, creator of BEAM robotics and the WowWee Robosapien humanoid robot, has also produced complex robotic movements from analog logic circuits and discrete electronic components, usually without a microprocessor. And to fill their heads, the artificial intelligence group OpenCog has stepped in to give these humanoid sculptures a brain.

Based in Hong Kong, OpenCog, an open-source AI programming group under the direction of Ben Goertzel, began its cerebral project with a Hanson Robokind: a toddler-like robot. The challenge is to take the basic intelligence of virtual avatars programmed for the screen and transfer it to a robotic body. To do this, however, a robot must have the capacity to perceive and interpret the outside world.

To a large extent, OpenCog’s virtual characters have that capacity. For example, each character has a database called an AtomSpace, where thousands of ‘atoms’ exist as knowledge concepts such as objects, actions and feelings (anger, fear, happiness). Every time a character encounters an object or concept in its environment, a corresponding atom is recorded in that character’s AtomSpace. Associations are also recorded as a character moves from one concept to another. With repetition, associative links grow stronger, influencing the character’s pathway choices and building associative memory; links that go unused decay over time.
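The mechanics described above can be sketched in a few lines of Python. This is an illustrative toy only, not the actual OpenCog AtomSpace API; the class and method names here are assumptions made for the example.

```python
class AtomSpace:
    """Toy associative memory: atoms are concepts, links carry strengths."""

    def __init__(self, decay=0.9):
        self.atoms = set()
        self.links = {}          # (atom_a, atom_b) -> link strength
        self.decay = decay

    def encounter(self, concept):
        """Record a concept the character has perceived."""
        self.atoms.add(concept)

    def associate(self, a, b, boost=1.0):
        """Strengthen the link between two concepts on each co-occurrence."""
        self.encounter(a)
        self.encounter(b)
        key = tuple(sorted((a, b)))
        self.links[key] = self.links.get(key, 0.0) + boost

    def tick(self):
        """Let every link decay a little with the passage of time."""
        self.links = {k: v * self.decay for k, v in self.links.items()}

    def strength(self, a, b):
        return self.links.get(tuple(sorted((a, b))), 0.0)


space = AtomSpace()
space.associate("bowl", "water")   # first encounter creates the link
space.associate("bowl", "water")   # repetition strengthens it
space.associate("bowl", "anger")   # a weaker, one-off association
space.tick()                       # unused links fade over time
print(space.strength("bowl", "water") > space.strength("bowl", "anger"))  # True
```

The repeated `bowl`–`water` pairing ends up stronger than the one-off `bowl`–`anger` link, so it would dominate the character’s pathway choices, mirroring the associative memory described above.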

Generalized intelligence about the environment is also built up as similarities between concepts trigger an associative network in the AtomSpace. Indeed, several algorithms can function at the same time: for example, following existing associative links while similarity judgments trigger wider associative networks. The theory behind this “cognitive synergy” is that humans have multiple thought processes going on simultaneously, prioritizing some over others in order to function.
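One way to picture two such processes acting on one network is a spreading-activation sketch, where activation flows both along learned associative links and to atoms a separate process judges similar. Again, this is a hypothetical illustration, not OpenCog’s actual algorithm.

```python
def spread_activation(links, similarity, seed, hops=1):
    """Spread activation from a seed atom along two kinds of edges:
    learned associative links and similarity judgments."""
    activation = {seed: 1.0}
    for _ in range(hops):
        nxt = dict(activation)
        # Both edge sets are undirected: check each pair in both directions.
        for edges in (links, similarity):
            for (a, b), w in edges.items():
                for src, dst in ((a, b), (b, a)):
                    if src in activation:
                        nxt[dst] = max(nxt.get(dst, 0.0), activation[src] * w)
        activation = nxt
    return activation


links = {("cup", "water"): 0.8}        # learned by repeated co-occurrence
similarity = {("cup", "bowl"): 0.6}    # judged similar by another process
act = spread_activation(links, similarity, seed="cup")
# "water" is activated by association, "bowl" by similarity:
# two concurrent processes contributing to one associative network.
```

The point of the sketch is the synergy: neither process alone would have activated both `water` and `bowl` from the single seed `cup`.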

Virtual characters also have basic needs programmed into them which can get depleted or filled by interaction with the environment. The status of a need has a significant impact on which pathways a character chooses to take. For example, if the need for water is high, a character will prioritize a water atom in its pathway choice.
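A minimal sketch of that need-driven choice might look as follows, assuming needs are levels between 0 and 1 where a low level means high urgency; the function and variable names are invented for this example.

```python
def choose_pathway(needs, pathways):
    """Pick the pathway whose target atom satisfies the most depleted need.

    needs:    {need_name: level in [0, 1]}, low level = urgent need
    pathways: {target_atom: need_name it satisfies}
    """
    # Score each candidate atom by the urgency (1 - level) of the need it fills.
    return max(pathways, key=lambda atom: 1.0 - needs.get(pathways[atom], 1.0))


needs = {"water": 0.1, "social": 0.8}            # water is nearly depleted
pathways = {"water_atom": "water", "friend_atom": "social"}
print(choose_pathway(needs, pathways))            # water_atom
```

With the water need nearly empty, the character prioritizes the water atom over the social one, exactly as the paragraph above describes.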

“The challenge in all of this,” remarked Jeffrey Epstein, who funds the project along with the Hong Kong Innovation in Technology Fund, “is to create a robotic nervous system that can perceive concepts in its environment as effectively as virtual avatars.” To date, Jeffrey Epstein’s foundation, the Jeffrey Epstein VI Foundation, plays an active role in supporting neuroscience around the world. In addition to establishing the Program for Evolutionary Dynamics at Harvard, the foundation is one of the largest funders of individual scientists, including Stephen Hawking and the Nobel Laureate physicists Gerard ‘t Hooft, David Gross and Frank Wilczek.

OpenCog is well on its way to developing such a nervous, or perceptive, system, currently focusing on basic language, sound and touch recognition, and on pixel-level imaging through a complex process called DeSTIN Machine Vision. There is still a long way to go, but the goal is rightly ambitious: not only to create a better working model of the human brain, but to explore a greater general intelligence. Jeffrey Epstein is dedicated to investing in science research and education throughout the world.
