Transfer learning applied to flower recognition 


Collecting and annotating enough data to achieve satisfactory results in image classification is often tedious and time-consuming. Transfer learning overcomes this challenge by reducing both the number of images and the training time needed to add a new class. It is applied here to flower classification, but it can be extended to many other use cases.


  • The STM32 model zoo provides everything you need to train and retrain on your own data
  • The solution uses a technique called “transfer learning” to quickly train a deep learning model to classify images
  • The model can be easily deployed on the STM32H747 discovery kit with the STM32 model zoo Python scripts
  • The use case presented is based on flower recognition
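The transfer-learning step described above can be sketched with the Keras API: a MobileNetV2 backbone (alpha 0.35, 128x128x3 input, matching the model listed below) is frozen and only a small classification head is trained on the five flower classes. The helper name and hyper-parameters are illustrative assumptions, not the model zoo's actual training script:

```python
import tensorflow as tf

def build_flower_classifier(num_classes=5, input_shape=(128, 128, 3),
                            alpha=0.35, pretrained=True):
    # Pretrained MobileNetV2 feature extractor, without its ImageNet head.
    base = tf.keras.applications.MobileNetV2(
        input_shape=input_shape,
        alpha=alpha,
        include_top=False,
        weights="imagenet" if pretrained else None,
    )
    base.trainable = False  # freeze pretrained features: only the head is trained

    model = tf.keras.Sequential([
        base,
        tf.keras.layers.GlobalAveragePooling2D(),
        tf.keras.layers.Dropout(0.2),
        tf.keras.layers.Dense(num_classes, activation="softmax"),
    ])
    model.compile(optimizer=tf.keras.optimizers.Adam(1e-3),
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model
```

With only the small head to optimize, a few epochs of `model.fit(train_ds, ...)` on the flower dataset are typically enough, which is what makes adding a new class fast.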


Vision: Camera module bundle (reference: B-CAMS-OMV)


Dataset: Flower recognition dataset (license CC BY 2.0)
Data format:
  • 5 classes of flowers: daisy, dandelion, rose, sunflower, tulip
  • RGB color images


Model: MobileNetV2 alpha 0.35 

Input size: 128x128x3

Memory footprint:
406.86 KB Flash for weights
224.5 KB RAM for activations

Accuracy:
Float model: 86.78%
Quantized model: 86.38%
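The small accuracy drop between the float and quantized figures comes from converting the trained model to 8-bit integers before deployment on the STM32. A minimal post-training integer quantization sketch using the TensorFlow Lite converter is shown below; the function name and the way representative samples are passed in are assumptions for illustration:

```python
import numpy as np
import tensorflow as tf

def quantize_int8(model, rep_images):
    """Convert a Keras model to a fully int8-quantized TFLite flatbuffer.

    rep_images: a small sample of training images, float32,
    shaped (N, H, W, C) to match the model input.
    """
    def representative_dataset():
        # The converter uses these samples to calibrate activation ranges.
        for img in rep_images:
            yield [img[np.newaxis, ...].astype(np.float32)]

    converter = tf.lite.TFLiteConverter.from_keras_model(model)
    converter.optimizations = [tf.lite.Optimize.DEFAULT]
    converter.representative_dataset = representative_dataset
    # Force integer-only ops so the model runs on integer MCU kernels.
    converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
    converter.inference_input_type = tf.int8
    converter.inference_output_type = tf.int8
    return converter.convert()  # returns the .tflite model as bytes
```

The resulting `.tflite` bytes are what the STM32Cube.AI / model zoo deployment scripts take as input; integer weights are what bring the Flash footprint down to the figure reported above.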

Performance on STM32H747 (High-perf) @ 400 MHz 
Inference time: 110.27 ms
Frame rate: 9.0 fps

Figure: confusion matrix of the flower recognition model.