MedBot: Telepresence Food Delivery & Sanitation Robot for COVID-19 Wards

MedBot: Personal Project (Jun 2020-Aug 2020)

As the entire world stands united to fight the wrath of COVID-19, if there is one industry that has blossomed amid this crisis, it is robotics and automation. Leveraging this opportunity and our skills in the domain, our team was determined to develop technology with the potential to mitigate the risks faced by the front-line warriors in hospitals throughout the country. To this end, we partnered with doctors at the Kodagu Institute of Medical Sciences, Madikeri, leading to the development of our first commercial product. The goal was to develop a simple and ergonomic line-following, remote-controlled robot with added virtual telepresence that can deliver food and essential supplies to patients in the COVID-19 isolation ward while enabling doctors to communicate with them remotely. This blog post walks you through our build process and the outcome we were able to achieve.

The first part of the build was the design of the robot. Over our years of building robotics solutions to problems in society, we have learned that the aesthetics of a product are as important as its functionality for it to do well in the market. We were very excited by the final outcome. Here’s a rendered image of MedBot, designed in Autodesk Fusion 360.

Fig 1: MedBot rendered image

Since careful consideration was given to ease of manufacturing, taking this robot from concept to reality was relatively simple. The material chosen for the exterior frame scales easily when fabricated with laser cutting, reproducing accurate dimensions and tight tolerances. It also let us carve out columns with accentuated curved profiles that improve the stability of the structure and allow the trays to protrude outwards, making their contents more accessible to the user. The panels were cut from thin 3 mm MDF sheets and adhered together to produce stable structural elements. The various elements of the robot were joined with the aid of jigs and then reinforced with adhesives and fasteners.

The cuboid base of the robot, with two control-panel access openings, houses the electronics and encloses the wheels to produce a clean exterior that hides all moving elements. The three trays supported by the vertical column carry the food and essentials for the patients in isolation. As mentioned earlier, each tray protrudes outwards more than the one below it for easier access. The top of the robot carries a user control interface housing the main power switch, the keypad for user input, an ultrasonic sensor, and a joystick for manual control; it is angled in towards the user for a comfortable viewing and operating experience. Finally, a tablet mount allows the doctor to be remotely telepresent and interact with the patients in the isolation wards.

Currently, MedBot has two variants: one that can be teleoperated remotely and a semi-autonomous version that traverses a predefined path. A fully autonomous version is under active development and will be available for sale soon. The semi-autonomous variant works as follows (a simplified firmware sketch of this control loop follows the list):

a) The nurse uses the manual control mode to align MedBot with the track it needs to follow. Pressing “0” and “A” toggles between the manual and autonomous modes of operation respectively.

b) A joystick mounted on the user control interface is used to maneuver the robot in the required direction and line it up with the track.

c) Once MedBot is loaded with the food and medicines to be delivered, the nurse presses “A”. Using the keypad, the nurse enters the bed numbers to which food needs to be delivered, and the robot starts traversing the path once the entry is complete. It follows a predefined track with the aid of an array of three IR sensors, and an ultrasonic sensor provides obstacle avoidance.

d) The robot stops near each bed at “nodes”: thick black segments identified when the entire sensor array is activated. Once the robot stops at the entered bed number, the patient can take their food and then place a hand over a second ultrasonic sensor on the user control panel to confirm the delivery. This ensures that the patient never has to physically touch any component of the robot.

e) Activating this sensor triggers the robot to move on to the next bed or return to its home base, where it can be sanitized and the process repeated.
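
To make steps (a)–(e) concrete, here is a minimal Arduino-style sketch of the delivery loop. The pin assignments, keypad wiring, detection thresholds, and single-digit bed entry are illustrative assumptions for this post rather than the exact MedBot firmware, and the motor helpers are left as stubs since the drive electronics are covered further below.

```cpp
// Simplified sketch of the semi-autonomous delivery loop, steps (a)-(e).
// Pin numbers, keypad wiring, and thresholds are assumptions for illustration.
#include <Keypad.h>

// 3-sensor IR array (assumed to read HIGH over the black track)
const int IR_LEFT = 30, IR_CENTER = 31, IR_RIGHT = 32;

// Ultrasonic sensors: one facing forward, one on the control panel
const int TRIG_FRONT = 22, ECHO_FRONT = 23;   // obstacle avoidance
const int TRIG_HAND  = 24, ECHO_HAND  = 25;   // patient "food taken" confirmation

// 4x4 membrane keypad used for mode toggling and bed-number entry
const byte ROWS = 4, COLS = 4;
char keys[ROWS][COLS] = {
  {'1','2','3','A'}, {'4','5','6','B'}, {'7','8','9','C'}, {'*','0','#','D'}
};
byte rowPins[ROWS] = {2, 3, 4, 5};
byte colPins[COLS] = {6, 7, 8, 9};
Keypad keypad(makeKeymap(keys), rowPins, colPins, ROWS, COLS);

bool autonomous  = false;
int  targetBed   = 0;     // single-digit bed number for simplicity
int  nodesPassed = 0;     // thick black "node" markers crossed so far

// Stub motor helpers: thin wrappers around the BTS7960 drivers
// (a driver sketch appears further below).
void driveForward() {}
void turnLeft()     {}
void turnRight()    {}
void stopMotors()   {}

long distanceCm(int trig, int echo) {
  digitalWrite(trig, LOW);  delayMicroseconds(2);
  digitalWrite(trig, HIGH); delayMicroseconds(10);
  digitalWrite(trig, LOW);
  long t = pulseIn(echo, HIGH, 30000UL);   // 0 if nothing echoes within 30 ms
  return t / 58;                           // microseconds -> centimetres
}

void setup() {
  pinMode(IR_LEFT, INPUT); pinMode(IR_CENTER, INPUT); pinMode(IR_RIGHT, INPUT);
  pinMode(TRIG_FRONT, OUTPUT); pinMode(ECHO_FRONT, INPUT);
  pinMode(TRIG_HAND,  OUTPUT); pinMode(ECHO_HAND,  INPUT);
}

void loop() {
  char key = keypad.getKey();
  if (key == '0') { autonomous = false; stopMotors(); }     // step (a): manual mode
  if (key == 'A') { autonomous = true;  nodesPassed = 0; }  // step (c): start delivery
  if (key >= '1' && key <= '9') targetBed = key - '0';      // bed-number entry

  if (!autonomous) return;   // in manual mode the joystick (not shown) drives the motors

  // Stop and wait while something blocks the path ahead.
  long front = distanceCm(TRIG_FRONT, ECHO_FRONT);
  if (front > 0 && front < 25) { stopMotors(); return; }

  bool l = digitalRead(IR_LEFT), c = digitalRead(IR_CENTER), r = digitalRead(IR_RIGHT);

  // Step (d): a "node" (thick black strip) activates the whole array at once.
  if (l && c && r) {
    nodesPassed++;
    if (nodesPassed == targetBed) {
      stopMotors();
      // Wait until the patient waves a hand over the confirmation sensor.
      long hand;
      do { hand = distanceCm(TRIG_HAND, ECHO_HAND); delay(100); }
      while (hand == 0 || hand > 10);
      // Step (e): the next bed (or the home base) is handled the same way.
    }
    driveForward(); delay(300);   // roll past the node before resuming line following
    return;
  }

  // Plain three-sensor line following between nodes.
  if (c)       driveForward();
  else if (l)  turnLeft();
  else if (r)  turnRight();
  else         stopMotors();      // track lost
}
```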

The PCB was designed in Autodesk EAGLE and fabricated in-house. The wiring and sensors were then finalized and soldered to the PCB to ensure a stable working system. Power comes from a 12 V, 7.7 Ah lead-acid battery. Two 12 V, 150 RPM Johnson motors power the wheels and are driven by two BTS7960 motor drivers (40 A peak current). Castor wheels mounted on the base serve as supporting elements, and an Arduino Mega microcontroller ties the whole system together. The robot also has a 12 V charging port and a battery voltage indicator at the rear of its cuboid base, as shown in Fig 3. The completed robot is shown in the images below, followed by a short sketch of how such motor drivers can be commanded:

Fig 2: MedBot – Front View
Fig 3: MedBot – Rear View
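
For those curious about how the Arduino Mega can command the BTS7960 modules, here is a minimal, hypothetical sketch. Each BTS7960 breakout exposes two PWM inputs (RPWM/LPWM) and two enable pins, and the direction is selected by which PWM input receives the duty cycle. The pin numbers below are assumptions and do not reflect the actual MedBot PCB.

```cpp
// Minimal sketch of one plausible BTS7960 hook-up on the Arduino Mega.
// Pin assignments are illustrative assumptions, not the actual MedBot wiring.

struct MotorDriver {
  int rpwm, lpwm;   // PWM inputs: drive one for forward, the other for reverse
  int ren, len;     // enable pins for the two half-bridges
};

MotorDriver leftMotor  = {10, 11, 40, 41};   // assumed wiring
MotorDriver rightMotor = {12, 13, 42, 43};

void initDriver(const MotorDriver &m) {
  pinMode(m.rpwm, OUTPUT); pinMode(m.lpwm, OUTPUT);
  pinMode(m.ren,  OUTPUT); pinMode(m.len,  OUTPUT);
  digitalWrite(m.ren, HIGH);   // keep both half-bridges enabled
  digitalWrite(m.len, HIGH);
}

// speed: -255 (full reverse) .. +255 (full forward)
void setSpeed(const MotorDriver &m, int speed) {
  speed = constrain(speed, -255, 255);
  if (speed >= 0) { analogWrite(m.lpwm, 0);      analogWrite(m.rpwm, speed);  }
  else            { analogWrite(m.rpwm, 0);      analogWrite(m.lpwm, -speed); }
}

void setup() {
  initDriver(leftMotor);
  initDriver(rightMotor);
}

void loop() {
  // Drive both wheels forward at a moderate duty cycle as a smoke test.
  setSpeed(leftMotor, 180);
  setSpeed(rightMotor, 180);
}
```

With this kind of interface, differential steering falls out naturally: turning is simply a matter of commanding the two wheels with different speeds or opposite signs.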

We hope that this simple solution finds meaningful use in hospital isolation wards, minimizing potentially hazardous contact between healthcare workers and COVID-19 positive patients.

Based on the feedback from the doctors, we decided to make the next prototype of this robot WiFi-controlled and also upgraded its mechanical build to make it more durable and robust. The following video summarizes all of its basic features: