{"id":3620,"date":"2019-06-01T11:48:10","date_gmt":"2019-06-01T02:48:10","guid":{"rendered":"http:\/\/163.180.4.222\/lab\/?p=3620"},"modified":"2019-06-01T11:48:10","modified_gmt":"2019-06-01T02:48:10","slug":"bridging-the-gap-between-artificial-vision-and-touch","status":"publish","type":"post","link":"https:\/\/biochemistry.khu.ac.kr\/lab\/?p=3620","title":{"rendered":"Bridging the gap between artificial vision and touch"},"content":{"rendered":"<p>&nbsp;<\/p>\n<p>&nbsp;<\/p>\n<h5>Object manipulation using an innovative glove allows large databases of detailed pressure maps to be obtained. Such data could lead to advances in robotic sensing and in our understanding of the role of touch in manipulation.<\/h5>\n<p>&nbsp;<\/p>\n<p>&nbsp;<\/p>\n<div class=\"article__body serif cleared\">\n<p>The study and replication of human sensory abilities, such as visual, auditory and tactile (touch-based) perception, depend on the availability of suitable data. Generally, the larger and richer the data set, the more closely models can mimic these functions. Advances in artificial visual and speech systems rely on powerful models, known as deep-learning models, and have been fuelled by the ubiquity of databases of digital images and spoken audio (see, for example,\u00a0<a href=\"http:\/\/go.nature.com\/2w7nc0q\" data-track=\"click\" data-label=\"http:\/\/go.nature.com\/2w7nc0q\" data-track-category=\"body text link\">go.nature.com\/2w7nc0q<\/a>). 
By contrast, progress in the development of tactile sensors \u2014 devices that convert a stimulus of physical contact into a measurable signal \u2014 has been limited, mainly because of the difficulty of integrating electronics into flexible materials<sup><a href=\"https:\/\/www.nature.com\/articles\/d41586-019-01593-w?utm_source=feedburner&amp;utm_medium=feed&amp;utm_campaign=Feed%3A+nature%2Frss%2Fcurrent+%28Nature+-+Issue%29#ref-CR1\" data-track=\"click\" data-action=\"anchor-link\" data-track-label=\"go to reference\" data-track-category=\"references\">1<\/a><\/sup>. In\u00a0<a href=\"https:\/\/www.nature.com\/articles\/s41586-019-1234-z\" data-track=\"click\" data-label=\"https:\/\/www.nature.com\/articles\/s41586-019-1234-z\" data-track-category=\"body text link\">a paper in\u00a0<i>Nature<\/i><\/a>, Sundaram\u00a0<i>et al.<\/i><sup><a href=\"https:\/\/www.nature.com\/articles\/d41586-019-01593-w?utm_source=feedburner&amp;utm_medium=feed&amp;utm_campaign=Feed%3A+nature%2Frss%2Fcurrent+%28Nature+-+Issue%29#ref-CR2\" data-track=\"click\" data-action=\"anchor-link\" data-track-label=\"go to reference\" data-track-category=\"references\">2<\/a><\/sup>\u00a0report their use of a low-cost tactile glove that addresses this issue.<\/p>\n<p>The authors\u2019 glove consists of a hand-shaped sensing sleeve that is attached to the palm side of a knitted glove (Fig. 1). The sleeve contains a force-sensitive film on which is sewn a network of 64 electrically conducting threads: 32 along one direction of the glove and 32 along the perpendicular direction. Each of the 548 points at which these threads overlap is a pressure sensor, because the electrical resistance of the interleaved film decreases when these points are pressed. The output of the glove can be processed as a 32\u2009\u00d7\u200932 array of greyscale pixels, in which the colour of each pixel indicates the applied pressure from low (black) to high (white). 
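The row-and-column readout described above can be sketched in a few lines of Python. Everything here (the function names, the resistance range, the fabricated measurement) is an illustrative assumption, not the authors' implementation; only the 32 × 32 geometry and the pressed-means-lower-resistance behaviour come from the article.

```python
ROWS, COLS = 32, 32            # 32 conducting threads in each direction
R_MIN, R_MAX = 1e3, 1e6        # assumed resistance range in ohms (pressed -> R_MIN)

def read_resistance(row, col):
    '''Stand-in for the electrical measurement at one thread crossing;
    a real driver would multiplex the rows and columns in turn.'''
    return R_MIN if (row + col) % 7 == 0 else R_MAX   # fabricated data for the demo

def scan_frame():
    '''Scan every crossing and return a 32 x 32 greyscale pressure map:
    0 (black) = low pressure, 255 (white) = high pressure.'''
    frame = []
    for r in range(ROWS):
        row_pixels = []
        for c in range(COLS):
            resistance = read_resistance(r, c)
            # Lower resistance means higher pressure, so invert the scale.
            level = (resistance - R_MIN) / (R_MAX - R_MIN)
            row_pixels.append(round(255 * (1.0 - level)))
        frame.append(row_pixels)
    return frame

frame = scan_frame()
```

A real acquisition loop would simply call scan_frame() repeatedly to produce the stream of maps.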
These pressure maps are recorded at about seven frames per second.<\/p>\n<p>&nbsp;<\/p>\n<figure class=\"figure\">\n<div class=\"embed intensity--high\">\n<div class=\"embed intensity--high\"><img decoding=\"async\" class=\"figure__image\" src=\"https:\/\/media.nature.com\/w800\/magazine-assets\/d41586-019-01593-w\/d41586-019-01593-w_16738100.jpg\" alt=\"A glove that uses neural networks to identify individual objects, estimate weights and explore tactile patterns\" data-src=\"\/\/media.nature.com\/w800\/magazine-assets\/d41586-019-01593-w\/d41586-019-01593-w_16738100.jpg\" \/><\/div>\n<\/div><figcaption>\n<p class=\"figure__caption sans-serif\"><span class=\"mr10\"><b>Figure 1 | A low-cost glove for artificial touch.<\/b>\u00a0Sundaram\u00a0<i>et al.<\/i><sup><a href=\"https:\/\/www.nature.com\/articles\/d41586-019-01593-w?utm_source=feedburner&amp;utm_medium=feed&amp;utm_campaign=Feed%3A+nature%2Frss%2Fcurrent+%28Nature+-+Issue%29#ref-CR2\" data-track=\"click\" data-action=\"anchor-link\" data-track-label=\"go to reference\" data-track-category=\"references\">2<\/a><\/sup>\u00a0describe a glove that consists of a hand-shaped sensing sleeve (black) attached to a knitted glove (yellow). The sleeve contains a force-sensitive film on which a network of electrically conducting threads (silver) is sewn. The points at which these threads overlap form pressure sensors. The authors show that pressure maps collected by these sensors during object manipulation enable machine-learning models to learn to identify individual objects, estimate the weights of objects and distinguish between different hand poses.<\/span>Credit: Subramanian Sundaram<\/p>\n<\/figcaption><\/figure>\n<p>&nbsp;<\/p>\n<p>&nbsp;<\/p>\n<p>In Sundaram and colleagues\u2019 study, the glove was worn to record several videos of pressure maps during 3\u20135-minute sessions of single-hand manipulation of 26 everyday objects. 
This procedure resulted in a database of detailed pressure maps that, to my knowledge, is one of the largest data sets of this kind. The authors found that the glove was flexible, robust and sensitive to small pressure changes, despite its fabrication cost of only about US$10.<\/p>\n<p>To demonstrate that the glove captures different interactions of the hand with each object, Sundaram\u00a0<i>et al.<\/i>\u00a0used the recorded data to carry out automatic object identification. They showed how a state-of-the-art deep-learning model, which was originally designed for large-scale image classification, could learn from the gathered pressure maps to re-identify the 26 objects during blind manipulation. The large number of maps and their spatial resolution proved essential for successful object identification.<\/p>\n<p>&nbsp;<\/p>\n<aside class=\"recommended pull pull--left sans-serif\" data-label=\"Related\"><a href=\"https:\/\/www.nature.com\/articles\/s41586-019-1234-z\" data-track=\"click\" data-track-label=\"recommended article\"><img decoding=\"async\" class=\"recommended__image\" src=\"https:\/\/media.nature.com\/w400\/magazine-assets\/d41586-019-01593-w\/d41586-019-01593-w_16754778.jpg\" \/><\/a><\/p>\n<p class=\"recommended__title serif\">Read the paper: Learning the signatures of the human grasp using a scalable tactile glove<\/p>\n<\/aside>\n<p>&nbsp;<\/p>\n<p>&nbsp;<\/p>\n<p>Next, the authors used the glove to pick objects up, and showed that a similar deep-learning model could estimate the weights of unknown objects. The glove was also worn during different hand poses, and the signal read by the sensors was detailed enough to distinguish between each pose. 
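The authors trained a deep convolutional network on the recorded maps; as a dependency-free toy stand-in, the sketch below labels a query map with the nearest map in a tiny fabricated database, treating each 32 × 32 frame as a 1,024-element vector. All names and data here are hypothetical illustrations, not the authors' model.

```python
import math

def flatten(frame):
    return [px for row in frame for px in row]

def identify(frame, training_set):
    '''training_set: list of (label, frame) pairs; returns the label of the
    training map closest to the query in Euclidean distance.'''
    query = flatten(frame)
    def dist(item):
        return math.dist(query, flatten(item[1]))
    return min(training_set, key=dist)[0]

# Tiny fabricated database: a 'mug' contact in one corner of the map,
# a 'ball' contact in the centre.
blank = [[0] * 32 for _ in range(32)]
mug = [row[:] for row in blank]
mug[0][0] = 255
ball = [row[:] for row in blank]
ball[16][16] = 255
database = [('mug', mug), ('ball', ball)]

probe = [row[:] for row in blank]
probe[0][0] = 200                     # a weaker press at the mug's contact point
print(identify(probe, database))      # -> mug
```

A nearest-neighbour baseline like this scales poorly and ignores spatial structure, which is why a convolutional network, as used in the study, is the natural choice for maps of this kind.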
Finally, Sundaram and colleagues analysed the collaborations between different hand regions during object grasping by looking at signal correlations.<\/p>\n<p>In addition to providing experimental evidence of well-studied principles that underlie human grasping, this data-driven exploration could improve our understanding of the function of touch during object manipulation. Deep-learning models have greatly advanced our knowledge of the neural mechanisms that underlie visual object recognition<sup><a href=\"https:\/\/www.nature.com\/articles\/d41586-019-01593-w?utm_source=feedburner&amp;utm_medium=feed&amp;utm_campaign=Feed%3A+nature%2Frss%2Fcurrent+%28Nature+-+Issue%29#ref-CR3\" data-track=\"click\" data-action=\"anchor-link\" data-track-label=\"go to reference\" data-track-category=\"references\">3<\/a><\/sup>. In this respect, a similar approach could be applied to the interpretation of tactile-information processing in the brain.<\/p>\n<p>Sundaram and colleagues simultaneously produced pressure maps and corresponding photographs of the hand during object manipulation, generating a large amount of synchronized visual and tactile information. Data sets of multiple forms of sensory perception are uncommon<sup><a href=\"https:\/\/www.nature.com\/articles\/d41586-019-01593-w?utm_source=feedburner&amp;utm_medium=feed&amp;utm_campaign=Feed%3A+nature%2Frss%2Fcurrent+%28Nature+-+Issue%29#ref-CR4\" data-track=\"click\" data-action=\"anchor-link\" data-track-label=\"go to reference\" data-track-category=\"references\">4<\/a><\/sup>, and represent a fundamental step towards the development of multisensory integration systems and an understanding of how the brain develops a coherent perception of the environment.<\/p>\n<p>Such a flexible sensing device might have various applications \u2014 for example, in medical diagnostics, personal health care and sport. But it could also impact on the development of active (externally powered) prosthetic and robotic hands. 
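A minimal sketch of the kind of signal-correlation analysis mentioned above: sum the pressure inside two hand regions frame by frame, then compute a Pearson correlation between the two time series. The region boundaries and the recording are fabricated for illustration and are not taken from the study.

```python
import math

def region_signal(frames, rows, cols):
    '''Summed pressure inside a rectangular region, one value per frame.'''
    return [sum(f[r][c] for r in rows for c in cols) for f in frames]

def pearson(x, y):
    '''Pearson correlation coefficient of two equal-length sequences.'''
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Fabricated recording: two regions loaded together on alternate frames,
# as they might be while repeatedly squeezing an object.
frames = []
for t in range(10):
    f = [[0] * 32 for _ in range(32)]
    if t % 2 == 1:
        f[2][2] = 255        # e.g. a thumb region
        f[20][20] = 255      # e.g. an index-finger region
    frames.append(f)

thumb = region_signal(frames, range(0, 5), range(0, 5))
index = region_signal(frames, range(18, 23), range(18, 23))
print(round(pearson(thumb, index), 3))   # regions that move together -> 1.0
```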
Tactile feedback has a crucial role in controlling hand movement and exerted forces, such that the lack of this information makes it challenging for both humans and robots to achieve a stable grasp<sup><a href=\"https:\/\/www.nature.com\/articles\/d41586-019-01593-w?utm_source=feedburner&amp;utm_medium=feed&amp;utm_campaign=Feed%3A+nature%2Frss%2Fcurrent+%28Nature+-+Issue%29#ref-CR4\" data-track=\"click\" data-action=\"anchor-link\" data-track-label=\"go to reference\" data-track-category=\"references\">4<\/a><\/sup><sup>,<\/sup><sup><a href=\"https:\/\/www.nature.com\/articles\/d41586-019-01593-w?utm_source=feedburner&amp;utm_medium=feed&amp;utm_campaign=Feed%3A+nature%2Frss%2Fcurrent+%28Nature+-+Issue%29#ref-CR5\" data-track=\"click\" data-action=\"anchor-link\" data-track-label=\"go to reference\" data-track-category=\"references\">5<\/a><\/sup>. Moreover, the sense of touch directly enables tactile exploration aimed at object recognition and localization. It is also known that providing active prostheses with tactile feedback could help to alleviate phantom-limb pain (the perception of pain from a missing limb), increase the sense of ownership over the prosthesis and reduce the cognitive stress involved in controlling the device, by enabling more natural operation<sup><a href=\"https:\/\/www.nature.com\/articles\/d41586-019-01593-w?utm_source=feedburner&amp;utm_medium=feed&amp;utm_campaign=Feed%3A+nature%2Frss%2Fcurrent+%28Nature+-+Issue%29#ref-CR6\" data-track=\"click\" data-action=\"anchor-link\" data-track-label=\"go to reference\" data-track-category=\"references\">6<\/a><\/sup>.<\/p>\n<p>Tactile sensors can be incorporated into a glove that envelops an artificial limb, or directly fixed onto mechanical parts<sup><a href=\"https:\/\/www.nature.com\/articles\/d41586-019-01593-w?utm_source=feedburner&amp;utm_medium=feed&amp;utm_campaign=Feed%3A+nature%2Frss%2Fcurrent+%28Nature+-+Issue%29#ref-CR5\" data-track=\"click\" data-action=\"anchor-link\" 
data-track-label=\"go to reference\" data-track-category=\"references\">5<\/a><\/sup><sup>,<\/sup><sup><a href=\"https:\/\/www.nature.com\/articles\/d41586-019-01593-w?utm_source=feedburner&amp;utm_medium=feed&amp;utm_campaign=Feed%3A+nature%2Frss%2Fcurrent+%28Nature+-+Issue%29#ref-CR7\" data-track=\"click\" data-action=\"anchor-link\" data-track-label=\"go to reference\" data-track-category=\"references\">7<\/a><\/sup>. In this respect, the technology of Sundaram and colleagues\u2019 device can be adapted to various shapes for integration into robotic or prosthetic arms. Currently, the main limitations stem from the dense sensor coverage that the glove requires. One drawback is the extensive wiring \u2014 although the authors\u2019 row-and-column design keeps such wiring reasonably constrained. Another is the rate at which pressure maps are recorded, which might need to be higher depending on the application (for example, if the tactile feedback were used to control a robotic hand). Nevertheless, I think that the glove in its present form or improved versions of it offer exciting prospects for robotics applications.<\/p>\n<p>An emerging type of machine-learning model has proved effective in mimicking the human ability to learn to perform actions from experience \u2014 a process called reinforcement learning. In the past few years, researchers have used particular gloves to record hand-pose data during object manipulation, and have fed this recorded experience into a model that learns from these data to generate successful manipulations<sup><a href=\"https:\/\/www.nature.com\/articles\/d41586-019-01593-w?utm_source=feedburner&amp;utm_medium=feed&amp;utm_campaign=Feed%3A+nature%2Frss%2Fcurrent+%28Nature+-+Issue%29#ref-CR8\" data-track=\"click\" data-action=\"anchor-link\" data-track-label=\"go to reference\" data-track-category=\"references\">8<\/a><\/sup>. 
This approach to transferring experience from humans to robots could benefit from the use of Sundaram and colleagues\u2019 data-acquisition glove.<\/p>\n<p>Finally, the current study paves the way for several computer-vision models to be reused for tactile-signal processing, allowing the application of decades of computer-vision research. This approach offers many benefits, such as the removal of various problems involving model selection that slowed progress in deep learning in its early stages. Sundaram and colleagues\u2019 glove could therefore lead to rapid advances in tactile sensing. I am confident that the low cost of the glove will facilitate the replication and sharing of the methodology used to fabricate the device and of the data-acquisition set-up. That would foster the use of large and standard data sets in tactile-sensing research \u2014 currently a major limitation with respect to computer vision<sup><a href=\"https:\/\/www.nature.com\/articles\/d41586-019-01593-w?utm_source=feedburner&amp;utm_medium=feed&amp;utm_campaign=Feed%3A+nature%2Frss%2Fcurrent+%28Nature+-+Issue%29#ref-CR4\" data-track=\"click\" data-action=\"anchor-link\" data-track-label=\"go to reference\" data-track-category=\"references\">4<\/a><\/sup>.<\/p>\n<p>&nbsp;<\/p>\n<p>&nbsp;<\/p>\n<\/div>\n<p><span class=\"emphasis\">Nature<\/span>\u00a0<strong>569<\/strong>, 638-639 (2019)<\/p>\n<p>&nbsp;<\/p>\n<div class=\"emphasis\">doi: 10.1038\/d41586-019-01593-w<\/div>\n<p>&nbsp;<\/p>\n<p>&nbsp;<\/p>\n<p>(Original article: click <a href=\"https:\/\/www.nature.com\/articles\/d41586-019-01593-w?utm_source=feedburner&amp;utm_medium=feed&amp;utm_campaign=Feed%3A+nature%2Frss%2Fcurrent+%28Nature+-+Issue%29\">here<\/a>~)<\/p>\n<p>&nbsp;<\/p>\n<p>&nbsp;<\/p>\n<p>&nbsp;<\/p>\n<p>&nbsp;<\/p>\n","protected":false},"excerpt":{"rendered":"<p>&nbsp; &nbsp; Object manipulation using an innovative glove allows large databases of detailed pressure maps to be obtained. 
Such data could lead to advances in<a href=\"https:\/\/biochemistry.khu.ac.kr\/lab\/?p=3620\" class=\"more-link\">(more&#8230;)<\/a><\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_monsterinsights_skip_tracking":false,"_monsterinsights_sitenote_active":false,"_monsterinsights_sitenote_note":"","_monsterinsights_sitenote_category":0,"jetpack_post_was_ever_published":false,"_jetpack_newsletter_access":"","_jetpack_dont_email_post_to_subs":false,"_jetpack_newsletter_tier_id":0,"_jetpack_memberships_contains_paywalled_content":false,"_jetpack_memberships_contains_paid_content":false,"footnotes":"","jetpack_publicize_message":"","jetpack_publicize_feature_enabled":true,"jetpack_social_post_already_shared":true,"jetpack_social_options":{"image_generator_settings":{"template":"highway","default_image_id":0,"font":"","enabled":false},"version":2}},"categories":[33,35,36,29,30],"tags":[],"class_list":["post-3620","post","type-post","status-publish","format-standard","hentry","category-do-biology","category-lets-do-computer-science","category-lets-do-physics","category-lets-do-science","category-recent-science-news"],"aioseo_notices":[],"jetpack_publicize_connections":[],"jetpack_featured_media_url":"","jetpack-related-posts":[{"id":3994,"url":"https:\/\/biochemistry.khu.ac.kr\/lab\/?p=3994","url_meta":{"origin":3620,"position":0},"title":"Bringing machine learning to the masses","author":"biochemistry","date":"August 3, 2019","format":false,"excerpt":"\u00a0 \u00a0 A machine learning tool called Northstar lets users play with data visually. PHOTO: MELANIE GONICK \u00a0 \u00a0 Yang-Hui He, a mathematical physicist at the University of London, is an expert in string theory, one of the most abstruse areas of physics. 
But when it comes to artificial intelligence\u2026","rel":"","context":"In &quot;Let's Do Computer Science!&quot;","block_context":{"text":"Let's Do Computer Science!","link":"https:\/\/biochemistry.khu.ac.kr\/lab\/?cat=35"},"img":{"alt_text":"","src":"","width":0,"height":0},"classes":[]},{"id":2668,"url":"https:\/\/biochemistry.khu.ac.kr\/lab\/?p=2668","url_meta":{"origin":3620,"position":1},"title":"Using neuroscience to develop artificial intelligence","author":"biochemistry","date":"February 15, 2019","format":false,"excerpt":"\u00a0 \u00a0 When the mathematician Alan Turing posed the question \u201cCan machines think?\u201d in the first line of his seminal 1950 paper that ushered in the quest for artificial intelligence (AI) (1), the only known systems carrying out complex computations were biological nervous systems. It is not surprising, therefore, that\u2026","rel":"","context":"In &quot;Essays on Science&quot;","block_context":{"text":"Essays on Science","link":"https:\/\/biochemistry.khu.ac.kr\/lab\/?cat=32"},"img":{"alt_text":"","src":"","width":0,"height":0},"classes":[]},{"id":1527,"url":"https:\/\/biochemistry.khu.ac.kr\/lab\/?p=1527","url_meta":{"origin":3620,"position":2},"title":"Emerging scientific technologies help defend human rights","author":"biochemistry","date":"September 2, 2018","format":false,"excerpt":"\u00a0 \u00a0 (\uc6d0\ubb38: \uc5ec\uae30\ub97c \ud074\ub9ad\ud558\uc138\uc694~) \u00a0 Science\u00a0\u00a031 Aug 2018: Vol. 361, Issue 6405, pp. 859-860 DOI: 10.1126\/science.361.6405.859 \u00a0 Against a backdrop of summer heat and a constant roar of distant howler monkeys, a scientific analyst piloted a drone to collect data from a hillside in northern Guatemala. 
At his side,\u2026","rel":"","context":"In &quot;Essays on Science&quot;","block_context":{"text":"Essays on Science","link":"https:\/\/biochemistry.khu.ac.kr\/lab\/?cat=32"},"img":{"alt_text":"","src":"","width":0,"height":0},"classes":[]},{"id":2979,"url":"https:\/\/biochemistry.khu.ac.kr\/lab\/?p=2979","url_meta":{"origin":3620,"position":3},"title":"Robots on the run","author":"biochemistry","date":"March 29, 2019","format":false,"excerpt":"\u00a0 After decades of clumsiness, robots are finally learning to walk, run and grasp with grace. Such progress spells the beginning of an age of physically adept artificial intelligence. \u00a0 Young animals gallop across fields, climb trees and immediately find their feet with enviable grace after they fall1. And like\u2026","rel":"","context":"In &quot;Let's Do Computer Science!&quot;","block_context":{"text":"Let's Do Computer Science!","link":"https:\/\/biochemistry.khu.ac.kr\/lab\/?cat=35"},"img":{"alt_text":"","src":"","width":0,"height":0},"classes":[]},{"id":1545,"url":"https:\/\/biochemistry.khu.ac.kr\/lab\/?p=1545","url_meta":{"origin":3620,"position":4},"title":"New machine-learning technologies for computer-aided diagnosis","author":"biochemistry","date":"September 4, 2018","format":false,"excerpt":"\u00a0 \u00a0 (\uc6d0\ubb38) \u00a0 \u00a0 Nature Medicine\u00a0(2018) \u00a0 \u00a0 Machine learning can be used for computer-aided diagnosis of acute neurological events and retinal disease and can be incorporated into conventional clinical workflows to improve health outcomes. 
\u00a0 \u00a0 Machine learning is a branch of data science that trains computers to\u2026","rel":"","context":"In &quot;Let's Do Biology!&quot;","block_context":{"text":"Let's Do Biology!","link":"https:\/\/biochemistry.khu.ac.kr\/lab\/?cat=33"},"img":{"alt_text":"","src":"","width":0,"height":0},"classes":[]},{"id":4485,"url":"https:\/\/biochemistry.khu.ac.kr\/lab\/?p=4485","url_meta":{"origin":3620,"position":5},"title":"The search for secrets of the human brain","author":"biochemistry","date":"October 18, 2019","format":false,"excerpt":"\u00a0 \u00a0 Large-scale national research projects hope to reveal how it learns, how it controls behaviour and how it goes wrong. \u00a0 \u00a0 Staff members at the Allen Institute in Seattle, Washington, a non-profit research organization that includes the Allen Institute for Brain Science.Credit: Allen Institute \u00a0 \u00a0 Christof Koch\u2026","rel":"","context":"In &quot;'12. \uc778\ub958\uc640 \ubb38\uba85'\uacfc '13. \ub1cc\uc640 \ubb38\uba85' \uad00\ub828&quot;","block_context":{"text":"'12. \uc778\ub958\uc640 \ubb38\uba85'\uacfc '13. 
\ub1cc\uc640 \ubb38\uba85' \uad00\ub828","link":"https:\/\/biochemistry.khu.ac.kr\/lab\/?cat=45"},"img":{"alt_text":"","src":"","width":0,"height":0},"classes":[]}],"jetpack_sharing_enabled":false,"jetpack_shortlink":"https:\/\/wp.me\/p9Xo1j-Wo","_links":{"self":[{"href":"https:\/\/biochemistry.khu.ac.kr\/lab\/index.php?rest_route=\/wp\/v2\/posts\/3620","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/biochemistry.khu.ac.kr\/lab\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/biochemistry.khu.ac.kr\/lab\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/biochemistry.khu.ac.kr\/lab\/index.php?rest_route=\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/biochemistry.khu.ac.kr\/lab\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=3620"}],"version-history":[{"count":1,"href":"https:\/\/biochemistry.khu.ac.kr\/lab\/index.php?rest_route=\/wp\/v2\/posts\/3620\/revisions"}],"predecessor-version":[{"id":3621,"href":"https:\/\/biochemistry.khu.ac.kr\/lab\/index.php?rest_route=\/wp\/v2\/posts\/3620\/revisions\/3621"}],"wp:attachment":[{"href":"https:\/\/biochemistry.khu.ac.kr\/lab\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=3620"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/biochemistry.khu.ac.kr\/lab\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=3620"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/biochemistry.khu.ac.kr\/lab\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=3620"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}