
INTRODUCTION TO INTELLIGENT ROBOTS

COURSE
Undergraduate course (Spring 2019)

Nguyen Do Van, PhD

1
Our Introduction
• Student:
– Name, specialization, what do you love?
– How did you learn about the course?
– What do you think about the course?
– What do you expect?

2
Course Objective

• Know what it takes to make a robust intelligent robot work
• Sense, think, act
• Understand the importance, approaches, research issues and
challenges in intelligent robots
• Know how to program an autonomous robot

3
Background Needed
• Pre-course:
– Python, Ubuntu
– Robotics?
– Artificial intelligence
– Machine learning
– Computer vision

4
Course Information
• What is in the course
– Introduction to Artificial Intelligence and Autonomous Robots
– Simple Robotics Programming in a Virtual Environment
– Robot perception: vision, sound
– Mobile robots: Robot Localization and Mapping
– Manipulator robots: Path Planning and Control
– Intelligent robots: Machine Learning, Reinforcement Learning
(advanced)
– Assignments and grading system
– Order and contents may change during the course

5
Lecturers
• Instructor: Nguyen Do Van, PhD
• Practice Instructor: (Decide later)

Website for INT 3409-1: https://uet-ai-robot.moodlecloud.com


Website for INT 3409-21: https://uet-ai-robot-02.moodlecloud.com

6
ARTIFICIAL INTELLIGENCE AND ROBOTS

7
Intelligent Robots
• Mechanical creature which can function autonomously
• Mechanical: built, constructed
• Creature: think of it as an entity with its own motivation and
decision-making process

8
Intelligent Robots
• Basic robot primitives: Sense / Learn / Think / Act
• Three acting paradigms:
– Hierarchical: Sense -> Plan -> Act
– Reactive: Sense -> Act
– Hybrid: Plan, then Sense -> Act
• Learning to think, learning to act
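The paradigms above can be sketched as control loops in Python (an illustrative sketch, not course material; `sense`, `plan`, and `act` are hypothetical stand-ins):

```python
# Illustrative sketch of the acting paradigms as control loops.

def sense(world):
    # Hypothetical sensor read: detect an obstacle in front of the robot.
    return {"obstacle_ahead": world.get("obstacle_ahead", False)}

def plan(percept):
    # Hypothetical planner: produce a (one-step) plan from the percept.
    return ["turn_left"] if percept["obstacle_ahead"] else ["move_forward"]

def act(command):
    # Hypothetical actuator: here we just return the executed command.
    return command

def hierarchical_step(world):
    # Hierarchical: Sense -> Plan -> Act, deliberating before every action.
    return [act(c) for c in plan(sense(world))]

def reactive_step(world):
    # Reactive: Sense -> Act, sensor readings map directly to an action.
    percept = sense(world)
    return act("turn_left" if percept["obstacle_ahead"] else "move_forward")
```

A hybrid system would call a planner occasionally and run the reactive loop in between.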

9
From heuristic planning to learning

10
Warehouse Robots

Kiva at Amazon; Fetch


Entertainment Robots

12
Home Assistant Robots

13
WHAT IS IN THE COURSE

14
AI and Robotics
• Robotics
– Autonomous control
– Mechanism, planning and navigation
– Sensing
• Autonomous Robots
– Robot Mechanism
– Navigation
– Sensor-driven
• Intelligent Robots
– Intelligent systems
– Learning Robots

15
Virtual Environment
• AI2-THOR v1.0 consists of 120 near photo-realistic 3D scenes
spanning four different categories: kitchen, living room,
bedroom, and bathroom
• Concepts
– Agent
– Scene
– Action
– Object Visibility
– Receptacle
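The concepts above can be made concrete with a toy Python model (this is not the AI2-THOR API; the class names, the 1.5 m visibility threshold, and the scene contents are illustrative assumptions):

```python
# Toy model of AI2-THOR-style concepts: an Agent acts in a Scene; objects
# are "visible" within a distance threshold; a Receptacle can hold objects.
from dataclasses import dataclass, field

@dataclass
class Obj:
    name: str
    distance: float               # distance from the agent, in metres
    receptacle: bool = False      # can this object contain other objects?
    contains: list = field(default_factory=list)

@dataclass
class Scene:
    name: str
    objects: list

class Agent:
    VISIBILITY = 1.5  # illustrative threshold: nearby objects count as visible

    def visible_objects(self, scene):
        # Object visibility: only nearby objects are reported to the agent.
        return [o for o in scene.objects if o.distance <= self.VISIBILITY]

    def put(self, item, receptacle):
        # An action: place an item into a receptacle object.
        if not receptacle.receptacle:
            raise ValueError(f"{receptacle.name} is not a receptacle")
        receptacle.contains.append(item)

# A kitchen-like scene with one receptacle (the fridge).
scene = Scene("Kitchen1", [
    Obj("Mug", 1.0),
    Obj("Fridge", 1.2, receptacle=True),
    Obj("Sofa", 4.0),
])
agent = Agent()
```

In the real simulator the same ideas appear as the agent's action API and per-object visibility/receptacle metadata.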
Machine Learning

17
Perception - Computer Vision
– Computer vision is the task of
understanding the images captured by a
camera
– Vision tasks that are simple for humans
can be very difficult for a computer
– For example, coloring each fruit a different color

What we see vs. what a computer sees
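A hint of why "color each fruit a different color" is hard for a computer: the image is only an array of numbers, so even a naive rule-based labeller must spell out every rule. A sketch with crude, hand-picked RGB thresholds (illustrative only; real systems use learned models):

```python
def classify_pixel(r, g, b):
    """Crudely label one RGB pixel; the thresholds are illustrative guesses."""
    if r > 150 and g < 100 and b < 100:
        return "red"      # e.g. an apple pixel
    if r > 150 and g > 150 and b < 100:
        return "yellow"   # e.g. a banana pixel
    return "other"        # background, or anything the rules miss

def label_image(pixels):
    """pixels: list of rows of (r, g, b) tuples -> same-shaped label grid."""
    return [[classify_pixel(*px) for px in row] for row in pixels]
```

Lighting changes, shadows, and overlapping fruit all break fixed thresholds, which is what motivates learned vision models later in the course.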


Computer Vision for Robotics

Image (or video) -> Sensing device -> Interpreting device -> Interpretations
(e.g. garden, spring, bridge, water, trees, flower, green)
Computer Vision for Robotics
• Object Recognition and Object Tracking

20
Dead Reckoning
• Estimates a position based on the change from a previous position
– The robot estimates its position using only sensors available on board
• Does not require maps or outside references
– Other localization methods require maps or external references such as GPS or signal strength
– Works even when the robot is in unfamiliar territory and unable to
communicate
• Utilizes sensor data and precise physical measurements of the robot:
– Internal sensors to estimate distance travelled
– Robot information: physical dimensions, size of its wheels, layout of the wheels
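The wheel-encoder idea above can be sketched for a differential-drive (two-wheeled) robot; the default dimensions and tick counts below are illustrative assumptions, not course values:

```python
import math

def dead_reckon(x, y, theta, ticks_left, ticks_right,
                ticks_per_rev=360, wheel_radius=0.05, wheel_base=0.30):
    """Update the pose (x, y, theta) from one interval of encoder ticks,
    using only internal measurements plus the robot's physical dimensions."""
    # Distance rolled by each wheel during the interval.
    d_left = 2 * math.pi * wheel_radius * ticks_left / ticks_per_rev
    d_right = 2 * math.pi * wheel_radius * ticks_right / ticks_per_rev
    d_center = (d_left + d_right) / 2            # distance of the robot centre
    d_theta = (d_right - d_left) / wheel_base    # change in heading (radians)
    # Midpoint approximation: move along the average heading of the interval.
    x += d_center * math.cos(theta + d_theta / 2)
    y += d_center * math.sin(theta + d_theta / 2)
    return x, y, theta + d_theta
```

Equal tick counts drive the robot straight; opposite counts turn it in place. Small errors accumulate every step, which is why dead reckoning drifts over time.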

21
Sensor Fusion
• Each type of sensor offers different information about the
environment
• Combining data from multiple sensors improves the ability of a
robot to understand the world
– Compass and gyroscope for dead reckoning; combine with an accelerometer
to determine linear velocity
• Sometimes sensors disagree, so engineers must design ways to
reconcile different inputs.
– Some sensors may have high uncertainty
– Detecting faults in sensors is made easier when multiple sensors are used
because errant measurements will be more obvious.
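One standard way to reconcile sensors with different uncertainties is inverse-variance weighting, the one-dimensional core of Kalman-style fusion (a sketch; the measurement values below are illustrative):

```python
def fuse(z1, var1, z2, var2):
    """Fuse two noisy measurements of one quantity (Gaussian assumption)."""
    w1, w2 = 1.0 / var1, 1.0 / var2   # weight = inverse of each variance
    estimate = (w1 * z1 + w2 * z2) / (w1 + w2)
    variance = 1.0 / (w1 + w2)        # fused estimate is more certain
    return estimate, variance
```

The fused variance is smaller than either input variance, which is why combining sensors improves the estimate; a sensor with high uncertainty is automatically down-weighted, and an errant measurement stands out against the other sensor's tighter band.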

22
Planning and Controlling

– Robots must make decisions that consider their entire environment
– Robots would be ineffective if they only considered their
immediate sensor measurements
– Planning is the procedure of devising a strategy for
achieving a goal based on a global perspective of the world.
Path Planning
• General Goal: compute motion commands to achieve a goal
arrangement of physical objects from an initial arrangement
• Basic problem: Collision-free path planning for one rigid or articulated object (the “robot”)
among static obstacles.
Inputs
• geometric descriptions of the obstacles and the robot
• kinematic and dynamic properties of the robot
• initial and goal positions (configurations) of the robot
Output
• Continuous sequence of collision-free configurations connecting the initial and goal
configurations
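A minimal instance of the problem statement above, for a point robot on a grid of static obstacles, can be sketched with breadth-first search (the grid, start, and goal are illustrative):

```python
from collections import deque

def plan_path(grid, start, goal):
    """grid: list of strings, '#' = obstacle.
    Returns a list of collision-free (row, col) cells from start to goal,
    or None if no such path exists."""
    rows, cols = len(grid), len(grid[0])
    frontier = deque([start])
    came_from = {start: None}         # cell -> cell we reached it from
    while frontier:
        cell = frontier.popleft()
        if cell == goal:
            path = []                 # walk back to the start to build the path
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols \
                    and grid[nr][nc] != '#' and (nr, nc) not in came_from:
                came_from[(nr, nc)] = (r, c)
                frontier.append((nr, nc))
    return None                       # no collision-free path exists
```

BFS returns a shortest path on the grid; real planners work in higher-dimensional configuration spaces and account for the robot's kinematics and dynamics.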

24
Localization and Mapping

– Localization is the task of determining a robot's position
– Mapping is the task of building an
accurate map of the environment
– Simultaneous Localization and
Mapping (SLAM)
– A process of dynamically building a map of the world
around a robot while estimating the position of the robot
within this map
– Applications
– Precision driving
– Room-to-room navigation
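The localization half can be illustrated with a discrete Bayes filter: a robot in a known 1-D cyclic corridor of doors and walls, with a noisy door sensor. (Full SLAM would also build the map; here the map is given, and the sensor probabilities are illustrative.)

```python
def localize(world, measurements, p_hit=0.8, p_miss=0.2):
    """world: list like ['door', 'wall', ...] describing a cyclic corridor.
    The robot moves exactly one cell right between measurements."""
    belief = [1.0 / len(world)] * len(world)   # uniform prior over cells
    for z in measurements:
        # Measurement update: reweight cells that match the observation.
        belief = [b * (p_hit if cell == z else p_miss)
                  for b, cell in zip(belief, world)]
        s = sum(belief)
        belief = [b / s for b in belief]       # renormalize to a distribution
        # Motion update: shift the whole belief one cell right (cyclic).
        belief = belief[-1:] + belief[:-1]
    return belief
```

After a few observations the belief concentrates on the cells consistent with the whole door/wall sequence, even though any single reading is ambiguous.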
Navigation

– One of the most important applications of
planning is navigation
– Navigation typically involves two levels of
planning:
– global plans
– local plans
Assignments and Examinations
• First assignment: For everyone
– Build a world in AI2-THOR
– Control it by keyboard or script
– Short report and presentation
– Time: announced later
• Mid-term examination:
– Written form
– Content: Artificial Intelligence in Robots
– Time: announced later

27
28
Features in Machine Learning
– Features are the observations that are used to form predictions
– For image classification, the pixels are the features
– For voice recognition, the pitch and volume of the sound samples are the features
– For autonomous cars, data from the cameras, range sensors, and GPS are features

– Extracting relevant features is important for building a model
– Time of day is an irrelevant feature when classifying images
– Time of day is relevant when classifying emails, because spam often arrives at night

– Common types of features in robotics
– Pixels (RGB data)
– Depth data (sonar, laser rangefinders)
– Movement (encoder values)
– Orientation or acceleration (gyroscope, accelerometer, compass)
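A sketch of how those raw observations become the single numeric vector a learning model consumes (the field names and readings are illustrative):

```python
def make_feature_vector(obs):
    """obs: dict of raw sensor readings -> one flat list of numbers."""
    features = []
    features.extend(obs["pixels"])      # flattened RGB values
    features.extend(obs["depth"])       # sonar / rangefinder readings
    features.extend(obs["encoders"])    # wheel movement
    features.extend(obs["imu"])         # orientation / acceleration
    return features
```

Feature selection happens here: leaving an irrelevant field out of this function is exactly the "time of day" decision described above.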
30
Reinforcement learning
• Making good decisions in new tasks is a fundamental challenge in AI
and ML
• Learning to make good sequences of decisions
• Intelligent agents learning and acting
– Learning by trial and error, in real time
– Improving with experience
– Inspired by psychology:
• Agents + environment
• Agents select action to maximize cumulative rewards
• What we discuss: Pure RL and Deep Reinforcement learning
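The trial-and-error loop can be sketched with tabular Q-learning on a toy chain world: states 0..4, actions left/right, reward only at the right end. All hyperparameters are illustrative.

```python
import random

def q_learning(n_states=5, episodes=300, alpha=0.5, gamma=0.9, epsilon=0.3):
    """Learn Q-values for a chain: actions -1/+1, reward 1 at the last state."""
    q = {(s, a): 0.0 for s in range(n_states) for a in (-1, 1)}
    for _ in range(episodes):
        s = 0
        for _ in range(200):                  # bound the episode length
            if s == n_states - 1:
                break                         # reached the rewarding state
            # Epsilon-greedy: explore sometimes, otherwise act greedily.
            if random.random() < epsilon:
                a = random.choice((-1, 1))
            else:
                a = max((-1, 1), key=lambda act: q[(s, act)])
            s2 = min(max(s + a, 0), n_states - 1)
            r = 1.0 if s2 == n_states - 1 else 0.0
            # Update: nudge Q(s, a) toward reward + discounted future value.
            best_next = max(q[(s2, -1)], q[(s2, 1)])
            q[(s, a)] += alpha * (r + gamma * best_next - q[(s, a)])
            s = s2
    return q
```

After training, the greedy policy prefers moving right in every state: the agent has learned a good sequence of decisions purely from delayed reward.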
RL Applications
• Multi-disciplinary Conference on Reinforcement
Learning and Decision Making (RLDM2017)
– Robotics
– Video games
– Conversational systems
– Medical intervention
– Algorithm improvement
– Improvisational theatre
– Autonomous driving
– Prosthetic arm control
– Financial trading
– Query completion
Assignments and Grade

Tasks          Methods                       Weight
Assignments    02 group projects             15%
Mid-term test  Written test                  10%
Practice       Programming in each lecture   15%
Final test     Oral                          60%
Overall                                      100%

33
Lab introduction
• HMI Lab:
– https://hmiuet.wordpress.com
– https://www.facebook.com/hmiuet/

34
Artificial Intelligence – Computer Vision Group

35
36
Intelligent Robot Team

Recruiting:
AI & CV group member
Intelligent Robot Team Member
Seminar: Saturday, 101, G2
Lab: 307, E3

37
What do WE do?
• Instructors, TA: Give lectures
• Students: Comprehend and Do Assignments
• WE: Do together
• Q&A

38
