Development of Intelligent Robot Grasping Strategies using Machine Learning and Deep Learning Techniques for Cobotics Framework


Product Information

List price: NT$1,216
Out of stock; sourced after ordering (delivery in approximately 30-45 days)
Reward points earned with this order: 36
Product Description


Robotic manipulators are meant to function like human hands and, like our hands, they should possess an intelligent grasping ability in order to perform complex manipulation tasks. However, enabling robots to execute intelligent, optimal grasps efficiently, the way we grasp objects, is quite challenging. We acquire this skill by spending much of our childhood trying and failing to pick things up and learning from our mistakes; for robots, we cannot wait through the equivalent of an entire robotic childhood. To streamline the process, in the present investigation we develop deep learning and machine learning based techniques that help robots learn quickly how to generate and execute appropriate grasps. In this context, for vision-based object detection, we design an effective loss function, Absolute Intersection over Union (AIoU), for faster and better bounding box regression, which has been verified using the You Only Look Once version 3 (YOLOv3) and Single Shot Detection (SSD) algorithms. Subsequently, for grasp generation on detected objects, we develop a genetic algorithm based grasp position estimator together with a deep reinforcement learning based grasp orientation estimator using a Grasp Deep Q-Network (GDQN). Since deep learning and reinforcement learning techniques are data hungry and sufficient labelled data are scarce, we address these challenges by proposing a hybrid (discriminative-generative) model based on the Vector Quantized Variational Autoencoder (VQ-VAE). More specifically, we develop two state-of-the-art models.
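The abstract does not spell out the exact AIoU formulation, so the sketch below only illustrates the general idea it names: an IoU-based bounding box regression loss augmented with an absolute-difference term so that non-overlapping boxes still yield a useful signal. The function names and the penalty term are assumptions for illustration, not the thesis code.

```python
import numpy as np

def iou(box_a, box_b):
    """Intersection over Union for axis-aligned boxes given as (x1, y1, x2, y2)."""
    ix1 = max(box_a[0], box_b[0])
    iy1 = max(box_a[1], box_b[1])
    ix2 = min(box_a[2], box_b[2])
    iy2 = min(box_a[3], box_b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

def aiou_loss(pred, target):
    """Hypothetical AIoU-style loss: (1 - IoU) plus a mean absolute
    coordinate difference, so disjoint boxes are still penalized in
    proportion to how far apart they are."""
    penalty = np.mean(np.abs(np.asarray(pred, float) - np.asarray(target, float)))
    return (1.0 - iou(pred, target)) + penalty
```

For a perfect prediction the loss is zero; for disjoint boxes the IoU term saturates at 1 and the absolute-difference penalty supplies the remaining gradient, which is the behaviour the abstract attributes to AIoU.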
The first, a Generative Inception Neural Network (GI-NNet) model, is capable of generating antipodal robotic grasps on seen as well as unseen objects. Trained on the Cornell Grasping Dataset (CGD), it performs excellently, attaining 98.87% grasp pose accuracy when detecting grasps from RGB-Depth (RGB-D) images of both regular and irregularly shaped objects, while requiring only one third of the trainable network parameters of State-Of-The-Art (SOTA) approaches. For the second model, we integrate VQ-VAE with GI-NNet, which we name Representation based GI-NNet (RGI-NNet). This model has been trained on various splits of the CGD to test the learning ability of our architecture, ranging from only 10% labelled data with the latent embedding of VQ-VAE up to 90% labelled data with the latent embedding. The grasp pose accuracy of RGI-NNet varies between 92.13% and 97.75%, which is far better than that of many existing SOTA models trained only on labelled datasets. To verify the performance of all the proposed grasp pose estimation models, we use the Anukul (Baxter) cobot, and we observe that our models perform significantly better in real-time tabletop grasp executions. Since the development of an ultimate Cobotics (collaborative robotics) framework requires smooth, seamless human-robot interaction, we also develop a fusion model that utilizes multiple modes of communication, such as speech and gesture, using Long Short-Term Memory (LSTM), Convolutional Neural Network (CNN) and 3-D CNN models on a humanoid robot framework, NAO. Finally, since we want cobots to be able to execute grasps based on learning, we also address robot grasping manipulation at the execution level, for instance by solving the inverse kinematics problem using reinforcement learning techniques.
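The latent embedding that the VQ-VAE contributes rests on vector quantization: each encoder output vector is snapped to its nearest entry in a learned codebook, and those discrete codes form the representation the grasp network can learn from with few labels. A minimal NumPy sketch of that lookup step, assuming a `quantize` helper and a toy codebook that are illustrative rather than taken from the thesis:

```python
import numpy as np

def quantize(z, codebook):
    """Nearest-neighbour codebook lookup, the core of VQ-VAE's
    vector quantization: each latent vector in z (shape N x D) is
    replaced by the closest of the K learned embeddings in the
    codebook (shape K x D) under squared Euclidean distance."""
    # pairwise squared distances between every latent and every code
    d = ((z[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)
    idx = d.argmin(axis=1)        # index of the nearest code per latent
    return codebook[idx], idx     # quantized latents and their code indices
```

During training the real model also needs a straight-through gradient estimator and codebook/commitment losses; the snippet shows only the forward quantization that produces the discrete latent representation.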


