Terrafarm is an autonomous farming solution that provides a comprehensive way to monitor crops at any scale. We give farmers the ability to scrutinize every square inch of their fields for a wide range of issues. By detecting crop diseases before they spread, Terrafarm can reduce the usage of harmful chemicals by up to 90% and help eradicate invasive species regionally. As the application provides health reports, farmers can optimize fertilizer use and reduce preventive pesticide, herbicide, and fungicide use.
Want to know more? Read our wiki here.
Our code can be found in the `src` directory. Read below to learn how to explore, run, and modify the backend and frontend, or play with the notebooks in the `notebooks` directory.
The backend comprises the image processing pipeline that processes multispectral images from farms. You can run it locally, or remotely on GCP (in a container). If you'd like to know more about the pipeline, read our wiki here.
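To give a flavor of how such a pipeline fits together, here is a minimal, illustrative sketch of chained multispectral processing stages (including an NDVI computation). This is not the actual Terrafarm implementation; all names and data shapes are hypothetical.

```python
# Illustrative sketch only -- a hypothetical stage-based pipeline, not the
# real Terrafarm code. Each stage takes the working data and returns it
# transformed, so stages can be chained in any order.
from typing import Callable

Stage = Callable[[dict], dict]

def normalize(data: dict) -> dict:
    # Scale raw 8-bit band values into [0, 1].
    data["bands"] = {band: value / 255.0 for band, value in data["bands"].items()}
    return data

def ndvi(data: dict) -> dict:
    # NDVI = (NIR - Red) / (NIR + Red), a standard vegetation-health index.
    nir, red = data["bands"]["nir"], data["bands"]["red"]
    data["ndvi"] = (nir - red) / (nir + red)
    return data

def run_pipeline(data: dict, stages: list[Stage]) -> dict:
    # Apply each stage in order, threading the data through.
    for stage in stages:
        data = stage(data)
    return data

sample = {"bands": {"nir": 200, "red": 50}}
result = run_pipeline(sample, [normalize, ndvi])
print(f"NDVI: {result['ndvi']:.2f}")
```

In the real pipeline each stage would operate on full multispectral rasters rather than single values, but the chaining pattern is the same idea.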
Run the image processing pipeline locally. Tested on Linux (Ubuntu 20) and Mac (Ventura 13). Components that do not involve ML training can also be run on Windows 10.
- Install Python 3.10
- Clone the repo

  ```shell
  git clone https://github.com/GDSC-Delft-Dev/apa.git
  ```

  Note that this might take a while.
- Set up the Python virtual environment

  ```shell
  pip install virtualenv
  virtualenv env
  source env/bin/activate      # Linux, Mac
  source env/Scripts/activate  # Windows
  ```
- Install Python requirements

  ```shell
  cd src/backend
  pip install -r requirements.txt
  ```
- Run the pipeline

  ```shell
  py main.py
  ```
The supported arguments for `main.py` are:

- `mode` (`local`/`cloud`) - specify if the input images are already in the cloud or need to be uploaded first from the local filesystem
- `path` - path to the input images, relative to the local/cloud root
- `name` - a unique name for the created job
Run the pipeline with images already in the cloud:

```shell
py main.py --path path/to/images --mode cloud
```

Run the pipeline with images on your local filesystem:

```shell
py main.py --path path/to/images --mode local
```
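For reference, the command-line interface above could be wired up with `argparse` roughly as follows. The flag names come from this README; the parser itself is a hypothetical sketch, not the repo's actual `main.py`.

```python
# Hypothetical sketch of main.py's argument parsing -- the flag names are
# taken from the README; defaults and help strings are assumptions.
import argparse

def build_parser() -> argparse.ArgumentParser:
    parser = argparse.ArgumentParser(description="Run the image processing pipeline")
    parser.add_argument("--mode", choices=["local", "cloud"], default="local",
                        help="whether the input images are already in the cloud "
                             "or must be uploaded from the local filesystem first")
    parser.add_argument("--path", required=True,
                        help="path to the input images, relative to the local/cloud root")
    parser.add_argument("--name", default=None,
                        help="a unique name for the created job")
    return parser

# Parse the same arguments as the cloud example above.
args = build_parser().parse_args(["--path", "path/to/images", "--mode", "cloud"])
print(args.mode, args.path)
```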
To use the cloud infrastructure, please request the GCP service account key at [email protected].
- Clone the repo

  ```shell
  git clone https://github.com/GDSC-Delft-Dev/apa.git
  ```

  Note that this might take a while.
- Set the GCP service account environment variable

  ```shell
  export GCP_FA_PRIVATE_KEY=<key>  # Linux, Mac
  set GCP_FA_PRIVATE_KEY=<key>     # Windows
  ```
- Trigger the pipeline

  Manual triggers allow you to run the latest pipeline builds from the Artifact Registry with custom input data using Cloud Run. You can run a job with either input data from your local file system or input data that already resides in the cloud.

  ```shell
  cd src/backend
  sudo chmod +x trigger.sh
  ./trigger.sh
  ```
The supported arguments for `trigger.sh` are:

- `-l` - path to the local images
- `-c` - path to the images on the cloud (Cloud Storage)
- `-n` - a unique name for the pipeline job

Note that local inputs are first copied to a staging directory in Cloud Storage, and will only be removed if the job succeeds.
Provide input data from a local filesystem:

```shell
./trigger.sh -l /path/to/data/ -n name-of-the-job
```

Provide input data from Cloud Storage:

```shell
./trigger.sh -c /path/to/data/ -n name-of-the-job
```
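Backend code that talks to GCP would read the exported key from the environment. A minimal, hypothetical sketch of that lookup (the variable name comes from the steps above; the helper and its error handling are illustrative assumptions):

```python
# Hypothetical helper -- reads the service account key set via
# `export GCP_FA_PRIVATE_KEY=<key>`. The error handling is an assumption,
# not the repo's actual code.
import os

def get_gcp_key() -> str:
    key = os.environ.get("GCP_FA_PRIVATE_KEY")
    if not key:
        raise RuntimeError(
            "GCP_FA_PRIVATE_KEY is not set; export the service account key "
            "before triggering the pipeline."
        )
    return key

os.environ["GCP_FA_PRIVATE_KEY"] = "dummy-key"  # simulate the export step
print(get_gcp_key())
```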
To execute the automated tests, run the `pytest` unit tests:

```shell
python -m pytest
```

You can find our tests in `src/backend/pipeline/test/unit`.
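A unit test in that directory might look like the following sketch; the `compute_ndvi` helper under test is purely illustrative, not a real module from the repo.

```python
# Illustrative pytest-style unit test -- the helper under test is
# hypothetical, not part of the actual src/backend code.
def compute_ndvi(nir: float, red: float) -> float:
    return (nir - red) / (nir + red)

def test_compute_ndvi_healthy_vegetation():
    # Healthy vegetation reflects far more near-infrared than red light.
    assert abs(compute_ndvi(nir=0.8, red=0.2) - 0.6) < 1e-9

def test_compute_ndvi_bare_soil():
    # Bare soil reflects NIR and red similarly, so NDVI stays near zero.
    assert abs(compute_ndvi(nir=0.3, red=0.25)) < 0.2
```

`pytest` discovers any function whose name starts with `test_`, so dropping a file like this under the test directory is enough for it to run with `python -m pytest`.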
Our project uses `mypy` and `pylint` to ensure the quality of the code. You can run them with:

```shell
python -m mypy . --explicit-package-bases
python -m pylint ./pipeline
```
The CI/CD pushes the build from the latest commit to the `pipelines-dev` repository in the Google Artifact Registry. Note that only the backend is covered.
You can find the pipeline declaration in `.github/workflows/pipeline.yml`.
Please refer to `apa/src/frontend/README.md`.
Anyone who is eager to contribute to this project is very welcome to do so. Simply take the following steps:
- Fork the project
- Create your own feature branch
- Commit your changes
- Push to the `dev` branch and open a PR
You can play with the datasets in the `notebooks` folder.
Distributed under the MIT License. See `LICENSE.txt` for more information.
- Google Developers Student Club Delft - [email protected]
- Paul Misterka - [email protected]
- Mircea Lica - [email protected]
- David Dinucu-Jianu - [email protected]
- Nadine Kuo - [email protected]