Plant Homie Movement Prototype 👋🏼

Continuing the Plant Homie prototyping series, the goal of this assignment was to gain practice prototyping gestures and movements in Plant Homie’s ideal environment: indoors.

You can view the first post of the Plant Homie series here 🌱 as well as the second series of the 3D model here ⚙️.

General Overview


Plant Homie offers an IoT solution for your plant, giving you three options:

  1. Adding water or light automatically, using sensors that detect soil moisture and root growth
  2. Adding water or light through the mobile app
  3. Adding water or light manually, using the physical button
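To illustrate how these three input paths might come together, here is a minimal Python sketch of a control routine that merges them into a list of actions. Everything in it — the `Command` type, the `decide` function, and the moisture threshold — is a hypothetical illustration, not part of the actual Plant Homie hardware or software.

```python
# Hypothetical sketch of Plant Homie's three watering/lighting triggers.
# All names and the threshold value are illustrative assumptions.

from dataclasses import dataclass
from typing import List, Optional

MOISTURE_THRESHOLD = 0.30  # assumed soil-moisture fraction below which watering starts

@dataclass
class Command:
    source: str  # "sensor", "app", or "button"
    action: str  # "water" or "light"

def decide(soil_moisture: float,
           app_request: Optional[str],
           button_pressed: bool) -> List[Command]:
    """Merge the three input paths into a list of actions to perform."""
    commands = []
    if soil_moisture < MOISTURE_THRESHOLD:   # option 1: automatic via sensors
        commands.append(Command("sensor", "water"))
    if app_request in ("water", "light"):    # option 2: mobile app request
        commands.append(Command("app", app_request))
    if button_pressed:                       # option 3: manual physical button
        commands.append(Command("button", "water"))
    return commands
```

For example, `decide(0.2, None, False)` would trigger automatic watering from the sensor path alone, while `decide(0.5, "light", True)` would act on both the app request and the button press.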

The purpose of Plant Homie is to give users flexibility in caring for their plant. Whether you’re at home, traveling, or don’t have time to care for your plant, Plant Homie is there for you!

The goal of this prototype is to allow users to make decisions wherever they are. Users may include first-time plant owners, busy plant owners, and IoT enthusiasts.

Scenarios include busy plant owners on the move. Those with busy schedules don’t have time to manually water the plant or turn on the light with the switch, or to use their own mobile devices.

Movement and bodystorming with the device allow quick gestures to be made while on a phone call, in a work meeting, or in between errands, especially with the rise and promotion of remote work during COVID-19.

The overall intention of bodystorming is to test whether the actions and their associated gestures are easy to follow, feasible for our ideal users, and something testers would even buy themselves.


The prototyping process began with low-cost and readily available materials to create the base of Plant Homie. This involved playing with different structures and leveraging existing plant pots.

The light/water structure is designed to stretch across the base so that all plants receive light and water evenly. Additionally, the height is adjustable to match the height of the plants.

The idea was to design a universal base so users could accommodate plants of any type, size, or height. More details about the prototyping phase can be found in the first post of the Plant Homie series here 🌱

Incorporating bodystorming techniques, the intended actions and their associated movements to be tested are listed below.

These actions focus on user convenience, allowing quick gestures for simple tasks while promoting multi-tasking. The ease of the gestures makes the product more desirable and usable than most average plant devices.


After testing with one bodystormer inside their work-from-home workspace, here are some initial findings.

Things that worked well:

  1. The clap gesture followed a similar existing mental model to turn on/off lights. This made the gesture simple and easy to remember.
  2. The overall actions and gestures were something the user would want to use and recommend to others. The tasks were simple and easy to follow.
  3. The associated actions and gestures didn’t have obvious misuse cases for the intended users, although typical misuse cases could include non-intended users such as pets or children, or accidental gestures.

Things that needed improvement:

  1. Creating a visible indicator of the sensor detecting gestures within the prototype. The tester wasn’t sure where the prototype would detect their gestures and movement — for example, where they had to wave to turn the water on or off.
  2. Adjusting the water strength by pinching can be difficult to control and use accurately. The tester recommended not making water strength a gesture feature due to its limitations and the potential frustration it could cause a larger user group.

Below you can view the bodystorming video:


Overall, concept testing indicated that the gestures were effective. The concept was desirable and feasible based on the feedback session held.

After concluding my testing and gathering feedback, here are a few things I would change and lessons I learned while creating this prototype.

Changes I would make:

  1. Going back to the prototyping steps, I would consider adding a visible motion sensor to better indicate where the gestures should be made. This includes mounting a sensor above the light/water lamp so users know where to wave and for how long.
  2. If I were to convert the water adjustment back to a physical button instead of a gesture, I would consider adding a new gesture for adjusting the height of the light/water lamp, with sensors at the top and bottom of the lamp base that users can hover a finger over.

Things I learned:

  1. Clapping to turn the light on/off would need to be reconsidered as a motion gesture. Existing clap features typically detect the sound, not the motion being made. I didn’t realize this until after testing and conducting further research.
  2. Movement prototyping in general is an interesting concept and something I would not have otherwise considered looking into. I am eager to dig into how movements and gestures work while considering misuse cases and users with limited mobility.
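The clap-detection lesson above can be made concrete with a tiny sketch: a microphone-based "clap" feature simply looks for a sharp spike in audio amplitude, so a loud clap triggers it while a silent hand motion never will. The function and threshold below are hypothetical illustrations, not a real sensor API.

```python
# Hypothetical illustration of why clap features respond to sound, not motion:
# a microphone-based detector just looks for a sharp amplitude spike.

def detect_clap(samples, spike_threshold=0.8):
    """Return True if any audio sample exceeds the spike threshold."""
    return any(abs(s) > spike_threshold for s in samples)

quiet_wave = [0.05, -0.04, 0.06, -0.05]  # silent gesture: no spike, no trigger
clap_burst = [0.02, 0.95, -0.90, 0.10]   # clap: sharp spike, triggers
```

A motion gesture like a silent clap produces no audio spike at all, which is why it would need a separate motion sensor rather than a microphone.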