# Step 3: Label Mini-scenes with Behavior
## Overview
You can use the KABR model on Hugging Face to label the mini-scenes with behavior. See the ethogram folder for the list of behaviors used to label the zebra videos.
## Using the `miniscene2behavior` Tool
Label the mini-scenes using the following command:
miniscene2behavior [--hub huggingface_hub] [--config path_to_config] --checkpoint path_to_checkpoint [--gpu_num number_of_gpus] --miniscene path_to_miniscene [--output path_to_output_csv]
## Usage Examples
### Download checkpoint from Hugging Face and extract config
miniscene2behavior --hub imageomics/x3d-kabr-kinetics --checkpoint checkpoint_epoch_00075.pyth.zip --miniscene path_to_miniscene
### Download checkpoint and config from Hugging Face
miniscene2behavior --hub imageomics/x3d-kabr-kinetics --config config.yml --checkpoint checkpoint_epoch_00075.pyth --miniscene path_to_miniscene
### Use local checkpoint and config
miniscene2behavior --config config.yml --checkpoint checkpoint_epoch_00075.pyth --miniscene path_to_miniscene
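If Step 2 produced many mini-scene directories, you can label them all in one pass with a shell loop. The sketch below is illustrative only: the `mini_scenes/*/` layout, the `labels/` output directory, and the per-directory CSV naming are assumptions about your setup, not behavior built into the tool.

```bash
# Sketch: label every mini-scene directory under mini_scenes/ (hypothetical layout)
# and write one CSV of behavior labels per directory into labels/.
mkdir -p labels
for scene_dir in mini_scenes/*/; do
    name=$(basename "$scene_dir")
    miniscene2behavior \
        --hub imageomics/x3d-kabr-kinetics \
        --checkpoint checkpoint_epoch_00075.pyth.zip \
        --miniscene "$scene_dir" \
        --output "labels/${name}.csv"
done
```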
## Important Notes
**GPU Usage**
If `gpu_num` is 0, the model runs on the CPU. Using at least one GPU greatly increases inference speed. If you're using OSC (the Ohio Supercomputer Center), you can request a node with one GPU by running a command like the sketch below.
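OSC schedules jobs with Slurm, so an interactive session with one GPU can be requested roughly as sketched here. The account name, walltime, and memory request are placeholders, and OSC's own documentation may recommend different flags or wrapper scripts.

```bash
# Sketch: request an interactive Slurm session with one GPU on OSC.
# PAS1234, the walltime, and the memory request are placeholders -- substitute your own.
salloc --account=PAS1234 --gres=gpu:1 --time=01:00:00 --mem=32G
```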
**Input Format**
Mini-scenes are clipped videos, each focused on an individual animal; the video is the raw footage from which the mini-scenes were extracted.
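Before labeling, it can help to confirm that a mini-scene directory actually contains the clips you expect. The check below assumes the mini-scene clips are `.mp4` files at the top level of the directory; adjust the pattern to match how your mini-scenes were written out in Step 2.

```bash
# Sketch: quick sanity check of a mini-scene directory (assumes .mp4 clips at the top level).
ls path_to_miniscene/*.mp4 | head
```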
## Resources
- Pre-trained KABR model on Hugging Face (imageomics/x3d-kabr-kinetics).
- Ethogram definitions: the behavior classification system used for the zebra videos.
- Example annotated outputs on Hugging Face.
## Tool Reference
### `miniscene2behavior`
Source: `src/kabr_tools/miniscene2behavior.py`
Applies a machine learning model to classify animal behaviors from mini-scene videos.
Parameters:
- `--hub`: Hugging Face Hub repository containing the model files.
- `--config`: Path to the configuration file (local or from the hub).
- `--checkpoint`: Path to the model checkpoint file.
- `--gpu_num`: Number of GPUs to use (0 for CPU); see the sketch after this list.
- `--miniscene`: Path to the mini-scene videos directory.
- `--output`: Path for the output CSV file (optional).
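Because `--gpu_num` decides whether inference runs on the CPU or a GPU, one convenient pattern is to derive it from the GPUs visible on the current node. This is only a sketch: it assumes `nvidia-smi` is on the PATH whenever a GPU is present, and it reuses the local-checkpoint invocation from the usage examples above.

```bash
# Sketch: set --gpu_num from the number of GPUs nvidia-smi reports (0 falls back to CPU).
if command -v nvidia-smi >/dev/null 2>&1; then
    NUM_GPUS=$(nvidia-smi --list-gpus | wc -l)
else
    NUM_GPUS=0
fi

miniscene2behavior \
    --config config.yml \
    --checkpoint checkpoint_epoch_00075.pyth \
    --gpu_num "$NUM_GPUS" \
    --miniscene path_to_miniscene
```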
## Next Steps
Once you have labeled your mini-scenes with behaviors, proceed to Step 4: Ecological Analysis to generate insights and visualizations from your behavioral data.