Kindbot: Quantified Cannabis

With an interest in gardening and some experience growing outdoors, we decided to bring what we had learned into our first indoor home grow. For our first grow tent, we had three priorities:

  • low cost - it shouldn't cost as much as the dispensary
  • top quality - the quality should rival the dispensary's
  • limited manpower - we are busy and don't want more chores

Though LEDs may run cooler, we opted for tried-and-true HID lighting for its output at a low upfront cost. However, putting MH/HPS lights in a small space exacerbates any heat issues when ambient home temperatures run warm. And when ventilating through an open window, we found evenings when temperatures dropped too low, slowing growth.

We experimented with cheap development boards and environmental sensors to monitor temperature and humidity, so we could easily receive notifications on our phones when temperatures drifted too high or too low. These so-called plant monitors can be built from various microcontrollers and sensors, including ones that measure soil moisture or light intensity.
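
As a rough sketch of that kind of monitor, the few lines of Python below poll a DHT22 temperature/humidity sensor and post an alert when readings leave a target band. The sensor model, GPIO pin, thresholds, and webhook URL are illustrative assumptions rather than Kindbot's actual setup.

    # Minimal sketch: poll a temperature/humidity sensor and push a phone alert
    # when readings drift out of range. Sensor, pin, thresholds, and webhook URL
    # are assumptions, not Kindbot's actual configuration.
    import time
    import requests
    import Adafruit_DHT  # legacy library for DHT11/DHT22 sensors

    SENSOR, PIN = Adafruit_DHT.DHT22, 4       # DHT22 data line on GPIO4 (example)
    TEMP_MIN_F, TEMP_MAX_F = 68.0, 82.0       # example comfort band for the tent
    ALERT_URL = "https://example.com/notify"  # placeholder push/webhook endpoint

    while True:
        humidity, temp_c = Adafruit_DHT.read_retry(SENSOR, PIN)
        if temp_c is not None:
            temp_f = temp_c * 9 / 5 + 32
            if temp_f < TEMP_MIN_F or temp_f > TEMP_MAX_F:
                # Fire a notification so a human can intervene.
                requests.post(ALERT_URL, json={
                    "message": f"Tent at {temp_f:.1f}F / {humidity:.0f}% RH"})
        time.sleep(60)  # poll once a minute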

The search for the right tool

Ultimately, we didn't want notifications about problems; we wanted the assurance that the environment would simply be kept right. We didn't want to stuff pricey, specialized greenhouse equipment into the tent, but we could harness our apartment's AC to cool the space. We learned about environmental controllers but found nothing well suited to small home grows.

With self-driving cars and home automation making their way into consumer products, we asked why there was no robotic gardener that maintains ideal conditions using feedback from the environment. So, with our backgrounds in math and computer science, we began designing our ideal home gardening tool using these new technologies.

Data-driven and AI-powered

We had several use cases in mind for this new tool:

  • Control appliances (fans, AC, irrigation) to maintain our target grow conditions
  • Log environmental data
  • Send an alert when a human is needed
  • Report plant health

With such general functionality, having a computer in the tent was the way to go. Plants don't tell us how they're feeling; they show us visually. So we included a camera, which let us go beyond simple air-quality statistics using computer vision, a collection of techniques that lets a computer make better decisions by observing the grow space directly. Essentially, computer vision allowed Kindbot to garden visually, more like how a human reads their plants. As a bonus, having a camera meant we could garden remotely, freeing us to worry less when we traveled.
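
For instance, capturing frames on a schedule is enough to support both remote check-ins and the vision models discussed later. The OpenCV snippet below is a minimal sketch; the camera index, interval, and output directory are placeholder choices.

    # Sketch: save a frame from the tent camera on a schedule, for remote
    # viewing and as input to vision models. Interval and paths are examples.
    import time
    from datetime import datetime
    import cv2  # OpenCV

    INTERVAL_S = 300                      # one snapshot every five minutes
    OUT_DIR = "/home/pi/kindbot/frames"   # placeholder storage location

    cap = cv2.VideoCapture(0)             # first attached camera
    while True:
        ok, frame = cap.read()
        if ok:
            stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
            cv2.imwrite(f"{OUT_DIR}/{stamp}.jpg", frame)
        time.sleep(INTERVAL_S)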

We also created nice timelapses, because growing is a matter of pride. And with all of this data, it was easier than ever to relate periods of rapid growth to particular conditions. We could make apples-to-apples comparisons with earlier grows, ultimately helping us beat the learning curve of understanding how to make good weed.
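
Assembling the saved frames into a timelapse can be as simple as shelling out to ffmpeg; the frame directory and output path below are examples.

    # Sketch: stitch saved snapshots into a timelapse with ffmpeg.
    # The glob pattern and output path are examples.
    import subprocess

    subprocess.run([
        "ffmpeg", "-framerate", "24",
        "-pattern_type", "glob", "-i", "/home/pi/kindbot/frames/*.jpg",
        "-c:v", "libx264", "-pix_fmt", "yuv420p",
        "/home/pi/kindbot/timelapse.mp4",
    ], check=True)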

Instead of going from memory, we had a real database to query. This ability to play back our grow supported our own experiments to improve yield and quality as we explored new strains, feeding protocols, irrigation methods, and lighting equipment. Having a full picture of the entire grow helped us track important variations that we would have missed by logging a single temperature reading for an entire day or week. Cannabis is a robust plant, but we believe quality can be impacted by something as small as losing control of the heat for a relatively small part of the lifecycle.
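
Even a single SQLite table goes a long way here. The schema and query below are a simplified stand-in for whatever a grow log actually stores, but they show the kind of playback we mean: per-day temperature swings across the whole grow rather than one reading per day.

    # Sketch: log readings to SQLite, then "play back" a grow by querying
    # how conditions varied. Schema and query are illustrative.
    import sqlite3
    import time

    db = sqlite3.connect("grow.db")
    db.execute("""CREATE TABLE IF NOT EXISTS readings (
                      ts REAL, temp_f REAL, humidity REAL)""")

    def log_reading(temp_f, humidity):
        db.execute("INSERT INTO readings VALUES (?, ?, ?)",
                   (time.time(), temp_f, humidity))
        db.commit()

    # Daily temperature swings over the whole grow, not one reading per day.
    rows = db.execute("""SELECT date(ts, 'unixepoch') AS day,
                                MIN(temp_f), MAX(temp_f), AVG(humidity)
                         FROM readings GROUP BY day ORDER BY day""").fetchall()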

Specializing Vision to Cannabis

After accumulating a large image corpus sourced from the internet, we began developing models to recognize different cannabis plant structures as well as stress signs, so that our monitor would be best positioned to notify us when a problem requires a human touch. For example, we developed a model that recognizes droopy plants, and we used it to make adjustments to irrigation. Similarly, our vision models recognize stress signs like yellowing leaves as well as signs of growth, and they can track plant development.
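
We can't reproduce the models here, but scoring a snapshot looks roughly like the sketch below, assuming a small binary Keras classifier; the model file, input size, and decision threshold are hypothetical.

    # Sketch: score a snapshot with a hypothetical binary "droopy vs. healthy"
    # classifier. The model file, input size, and threshold are assumptions.
    import numpy as np
    from tensorflow import keras

    model = keras.models.load_model("droopy_model.h5")   # hypothetical model file

    def is_droopy(image_path, threshold=0.5):
        """Return True when the classifier thinks the plant looks droopy."""
        img = keras.utils.load_img(image_path, target_size=(224, 224))
        x = keras.utils.img_to_array(img)[np.newaxis] / 255.0   # batch of one
        return float(model.predict(x)[0][0]) > threshold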

Now that our controller/monitor had the extra sense of sight, the question remained: could we enable the device to act on this new information? Instead of reinventing the wheel, we decided to leverage off-the-shelf smart plugs and let Kindbot control garden equipment. This means that with the rich data we collect via sensors and camera, Kindbot can switch an appliance with the full picture of the environment. With more information, we were able to build high-fidelity temperature control, and using the vision models, Kindbot could trigger an extra irrigation event by activating the pump's smart plug whenever it detected signs of drought stress.
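
Tying it together might look like the sketch below: a simple hysteresis thermostat for the AC plus a vision-triggered watering. The plug addresses, target band, and pump timing are assumptions, python-kasa stands in for whichever smart-plug library is actually used, and read_temp_f, latest_snapshot, and is_droopy are hypothetical hooks wrapping the earlier sketches.

    # Sketch: hysteresis control of the AC plus a vision-triggered watering.
    # Plug addresses, setpoints, and timings are examples; python-kasa is one
    # off-the-shelf library for TP-Link smart plugs.
    import asyncio
    import time
    from kasa import SmartPlug

    PLUGS = {"ac": "192.168.0.50", "pump": "192.168.0.51"}  # example addresses
    TARGET_F, BAND_F = 75.0, 2.0      # aim for 75F with a 2F hysteresis band

    def set_plug(name, on):
        """Switch a smart plug by nickname."""
        async def _toggle():
            plug = SmartPlug(PLUGS[name])
            await plug.update()
            await (plug.turn_on() if on else plug.turn_off())
        asyncio.run(_toggle())

    def control_loop(read_temp_f, latest_snapshot, is_droopy):
        # read_temp_f, latest_snapshot, and is_droopy are hypothetical helpers
        # wrapping the sensor and vision sketches above.
        while True:
            temp_f = read_temp_f()
            if temp_f > TARGET_F + BAND_F:
                set_plug("ac", True)      # too hot: run the AC
            elif temp_f < TARGET_F - BAND_F:
                set_plug("ac", False)     # cool enough: shut it off

            if is_droopy(latest_snapshot()):
                set_plug("pump", True)    # extra irrigation event
                time.sleep(30)            # run the pump briefly
                set_plug("pump", False)

            time.sleep(60)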

We were impressed with the results of our experiments and wanted to make this tool as accessible as possible, so we began developing a mobile front-end to simplify the experience of using a tiny computer, camera, environmental sensor, and accompanying smart strip. With Kindbot as the brain, we control all our dumb, cheap appliances while maintaining ideal environmental conditions.