Cara.

Monkeys control a robotic arm by brainpower alone


N.O.R.F   

I saw this on the news, but should it really be surprising? Monkeys have been known to be clever animals, so does operating a robotic arm to grab food really mean anything?

 

Interesting finding nonetheless.

Cara.   

The breakthrough isn't in how clever monkeys are. Even if a genius did it, it would be remarkable. The monkey isn't using his hand to move the robotic arm. The arm is hooked up directly to his brain, and somehow he is directing the tiny electric currents that neurons fire (to control our muscles, to think, etc.) to manipulate the robotic arm.

 

 

------------------------------------------

Researchers report that monkeys fed themselves using robotic arms controlled mentally—no joystick required. The findings, reported today in Nature, suggest that patients with neuromuscular disorders, spinal cord injuries or lost limbs may one day be able to use their own brain power to operate prosthetics to carry out routine tasks.

 

"This is the first reported demonstration of the use of [brain–machine interface] technology by subjects to perform a practical behavioral act," John Kalaska, a physiologist at the University of Montreal, wrote in an editorial accompanying the study. "It represents the current state of the art in the development of neuroprosthetic controllers for complex armlike robots that could one day...help patients perform many everyday tasks such as eating, drinking from a glass or using a tool."

 

Scientists at the University of Pittsburgh (Pitt) placed two rhesus monkeys in a chair with their arms lightly strapped down to the armrests (and effectively immobilized) while a small grid with 100 electrodes in it was connected to 100 nerve cells, or neurons, in their primary motor cortex, a brain region associated with motion. The sensor grid picked up the neural activity and relayed it to a computer that controlled the prosthetic arm situated near the animal's left shoulder.

 

In an initial "training phase," researchers moved the prosthetic arm using the computer controls so that it moved in front of the monkey, reached out and snagged a treat—a strawberry, grape or marshmallow—dangling on a hook as the animal looked on. The neurons in their primary motor cortex responded to the movements of the arm. According to study co-author and Pitt neurophysiologist Andrew Schwartz, different nerve cells would perk up in response to different directions of movement. For example, some would activate when the arm reached upward for food, while others would activate when the arm moved back toward the animals' mouths.

 

After matching neurons to different directions of movement and feeding the information into an algorithm in the software that actually moved the arm, the control was turned over to the immobilized monkey.
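The decoding step described above — matching each neuron to a preferred direction of movement, then combining their activity to drive the arm — resembles population-vector decoding, the approach Schwartz's lab is known for. A minimal sketch follows; the variable names, shapes, and simulated firing rates are illustrative assumptions, not the study's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
N_NEURONS = 100  # the study recorded from ~100 neurons via a 100-electrode grid

# Training phase: assign each neuron a "preferred direction" in 3-D,
# the movement direction that drives it to fire most strongly.
preferred = rng.normal(size=(N_NEURONS, 3))
preferred /= np.linalg.norm(preferred, axis=1, keepdims=True)

def decode_direction(firing_rates, baseline):
    """Weight each neuron's preferred direction by how far its firing
    rate is above baseline, sum the weighted vectors, and normalize."""
    weights = firing_rates - baseline
    v = weights @ preferred  # population vector, shape (3,)
    norm = np.linalg.norm(v)
    return v / norm if norm > 0 else v

# Simulate a reach "up" (+z): neurons whose preferred direction aligns
# with +z fire above baseline, roughly in proportion to the alignment.
target = np.array([0.0, 0.0, 1.0])
baseline = np.full(N_NEURONS, 10.0)
rates = baseline + 5.0 * (preferred @ target) + rng.normal(0, 0.5, N_NEURONS)

print(decode_direction(rates, baseline))  # close to [0, 0, 1]
```

With enough randomly tuned neurons, the noise averages out and the summed vector points near the intended direction — which is why the approach tolerates recording from only a small sample of motor cortex.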

 

The monkeys, seeing the treat and wanting to indulge, were able to will the prosthetic—consisting of a shoulder that moved in three directions, an elbow that moved up and down and a clawlike hand that opened and closed—to respond. The electrodes in their brains would then measure activity from certain neurons and send the information to the computer—where it would match corresponding nerve cells with the direction of movement—and the robotic hand would maneuver accordingly.

 

"The animals used the device in a very natural way, making smooth, coordinated movements that look pretty natural," says Schwartz. "They were reaching for small pieces of food in a very precise way."

 

In fact, the monkeys were successful at grabbing and eating the food nearly 61 percent of the time, he says. Schwartz says that he had hoped they would have a better success rate, but noted that the results compared favorably to similar studies where both monkeys and humans move objects in virtual environments.

 

Kalaska says that although the new work is encouraging, there are hurdles to overcome before humans can use so-called neuroprosthetic limbs. A major challenge is to design more durable electrodes, because the current crop degrade within weeks of implantation. Another limitation: current prosthetics cannot control the force with which they grip things, which means that a glass, for instance, might be shattered when handled. Schwartz says the team now plans to research ways to build a more accurate prosthetic, with a wrist joint and a more humanlike hand.

 

Link

N.O.R.F   

No ISH! :eek: I take my words back.

 

I should pay more attention dhe :rolleyes:

 

Maybe they can sort our country out for us :D


Amazing walaahi.

 

I took an Artificial Intelligence & Robotics course in uni. The amount of programming that went into controlling the movements of a stepper motor, and by extension a robotic arm, was daunting — and these were not industrial robotic arms, just small-scale prototypes. So this represents a huge shortcut if successful.

Cara.   

Is the key here the software interface? This seems to be another application of machine learning.

 

This is how robots will take over. Sure, at first the commands will go unidirectionally, from you to the machine, but soon enough the software will decide it would be far more efficient if it tweaks your thinking here and there. Before you know it, that arm you installed to hold the groceries while you fish out your keys will be suggesting that maybe you should cut back on all the carbohydrates and not buy so much bread :(

Naden   

Very interesting. Tremendous implications for people who are paralyzed from the neck down or weakened on one (or both) sides by a stroke. More so, I think, for the elderly. I've always believed that technological and medical innovation is the answer to the independence and care of the elderly.


Exactly ... the interface was the most difficult part. In our implementation the robotic arm could only move in a two-dimensional (X, Y) plane; we had to set the depth manually. The software we wrote controlled the horizontal and vertical displacement by accepting a linear equation as input. We put a marker at the tip of the arm, fed in the equation, and the arm would draw the graph on a large whiteboard — but the movement was very rigid and jumpy because it was moving in discrete increments.
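The discrete-increment control described above can be sketched in a few lines. This is a minimal illustration, not the original coursework code; the function name, step size, and coordinates are assumptions.

```python
def line_path(m, b, x_start, x_end, step):
    """Return the (x, y) waypoints for the line y = m*x + b,
    advancing x in fixed discrete increments — the source of the
    rigid, jumpy motion described above."""
    points = []
    x = x_start
    while x <= x_end:
        points.append((x, m * x + b))
        x += step
    return points

# Draw y = 2x + 1 from x = 0 to x = 4 in steps of 1.
waypoints = line_path(2.0, 1.0, 0.0, 4.0, 1.0)
print(waypoints)  # [(0.0, 1.0), (1.0, 3.0), (2.0, 5.0), (3.0, 7.0), (4.0, 9.0)]
```

Each waypoint is a discrete jump for the arm tip; shrinking the step smooths the drawn line but multiplies the number of motor commands — exactly the kind of bookkeeping a direct brain interface would sidestep.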

 

This method would allow you to just 'think' the motion and get natural, fluid movement ... cutting out the cumbersome middleman.

 

Too much info, I know :D To summarize: even just basic movement is a lot of work, and this looks promising.

 

Cara, what's with the conspiracy theories? They're just tools :D

