The CommuniCity Toolbox is a set of open-source components, services and implementations based on the MIMs, developed as part of the CommuniCity Project.
The project's documentation can be found here: https://communicity-docs.readthedocs.io/
The Toolbox project aims to provide a set of software components, services and approaches that support the design and development of innovative solutions within the CommuniCity Project. Its main objective is twofold: to let developers speed up and simplify the implementation of city solutions, and to provide tools that enable interoperability between existing systems and developers' solutions while ensuring the replicability of the developed applications.
More precisely, the Toolbox offers the following main features:
- A set of ready-to-use AI solutions that use an NGSI-LD context broker to get and post data based on well-defined data models.
- Specifications of the data models used.
- Full support for the first three MIMs.
- A collection of machine learning models spanning various domains.
- A collection of tools and implementations that help developers build MIM-based solutions.
Minimal Interoperability Mechanisms (MIMs) are universal tools for achieving interoperability of data, systems, and services between cities and suppliers around the world. MIMs are based on a set of baseline specifications and references related to the different cities and communities that are part of the OASC network.
The current version of the Toolbox supports the following reference implementations of the first three MIMs; future releases will add more reference implementations and support for additional MIMs:
Context information is comprehensive, structured and formally defined status information about real-world entities. The Toolbox components are designed to produce context data based on the NGSI-LD standard, which can then be published to a context broker, making it available to any application. This approach enables an efficient way of retrieving and publishing data, relying on a well-known standard and unifying the output of each component on a single common endpoint.
The proposed implementation is the Orion-LD Context Broker, a context broker for context data management that supports both the NGSI-LD and NGSI-v2 APIs.
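As a rough illustration of this workflow, the sketch below posts an NGSI-LD entity to an Orion-LD broker using Python's requests library. The broker URL, entity type and attribute are placeholders for illustration, not an actual Toolbox data model.

```python
import requests

# Minimal NGSI-LD entity; the type and attribute below are placeholders,
# not an actual Toolbox data model.
entity = {
    "id": "urn:ngsi-ld:ExampleObservation:001",
    "type": "ExampleObservation",
    "score": {"type": "Property", "value": 0.97},
    "@context": "https://uri.etsi.org/ngsi-ld/v1/ngsi-ld-core-context.jsonld",
}

# Assumes an Orion-LD instance reachable on its default port 1026.
response = requests.post(
    "http://localhost:1026/ngsi-ld/v1/entities",
    json=entity,
    headers={"Content-Type": "application/ld+json"},
)
response.raise_for_status()  # Orion-LD replies 201 Created on success
```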
Each of the Toolbox components generates its own entities, with a specific structure and data types for the task it performs. To formally define each component's output, a set of data models has been created. These models capture the purpose of each task and describe the attributes its output must hold.
The Smart Data Models initiative is the baseline repository for common data models to be used in CommuniCity. In addition, new data models are created to fit the Toolbox requirements.
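To give an idea of what a data-model-conformant entity looks like in practice, the sketch below shows a hypothetical face-detection output expressed as a Python dictionary. The attribute names (boundingBox, confidence, detectedIn) are illustrative only; the authoritative definitions are in the data model specifications referenced below.

```python
# Hypothetical face-detection output entity following the NGSI-LD structure.
# Attribute names are illustrative; refer to the published data model
# specifications for the actual Toolbox schemas.
face_entity = {
    "id": "urn:ngsi-ld:Face:2f6b1c3e",
    "type": "Face",
    "boundingBox": {
        "type": "Property",
        "value": {"xMin": 0.12, "yMin": 0.30, "xMax": 0.45, "yMax": 0.78},
    },
    "confidence": {"type": "Property", "value": 0.94},
    # Relationship pointing back to the image the face was detected in.
    "detectedIn": {
        "type": "Relationship",
        "object": "urn:ngsi-ld:Image:source-001",
    },
    "@context": "https://uri.etsi.org/ngsi-ld/v1/ngsi-ld-core-context.jsonld",
}
```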
A standardized data marketplace is used to expose the data generated by the Toolbox components along with the data models used, enabling interoperability between different applications and systems. The marketplace serves as a portal for discovering and accessing the offered services in a secure and private way.
The implementation used in this project is the Business API Ecosystem. It provides capabilities for managing, publishing, and generating revenue from different kinds of assets (both digital and physical) across the whole service life cycle.
The Toolbox Projects are concrete implementations of some of the Toolbox components that are designed to be used by end users or applications. They also serve as a showcase of the functionalities of the Toolbox and provide practical examples of how to use the components to build real-world applications.
Projects work in conjunction with a context broker, which acts as a mediator between the Projects and the data sources. Input data is retrieved from the context broker, processed, and the resulting output is posted back to it. This generated data uses the NGSI-LD format and follows the data models defined by the Toolbox.
Each Project comes with a REST API already implemented that allows it to be executed as a service. A docker-compose file is also provided to orchestrate the deployment of multiple services simultaneously.
The following table contains the currently provided Projects:
Name | Description |
---|---|
FaceDetection | Detect faces in images |
FaceRecognition | Detect and extract features of faces, create a facial dataset and recognize faces |
AgeGender | Detect faces, predict their gender and estimate their age |
InstanceSegmentation | Perform instance segmentation on images |
Keypoints | Predict the position of body key points |
FaceEmotions | Classify different types of face expressions |
ImageStorage | API to upload and download images and visualize the data generated by the Toolbox |
FrontEnd | Web UI to showcase the Toolbox Project APIs and components |
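As an example of how a Project service might be called over its REST API, the sketch below sends a prediction request to a locally running FaceDetection container. The port and the /predict route are assumptions made for this sketch; the real routes are documented with each Project.

```python
import requests

# Assumptions: a FaceDetection container running locally on port 8080 and
# exposing a hypothetical /predict route that takes the NGSI-LD id of an
# image entity. Check each Project's own API documentation for real routes.
IMAGE_ID = "urn:ngsi-ld:Image:source-001"

response = requests.post(
    "http://localhost:8080/predict",
    json={"image_id": IMAGE_ID},
)
response.raise_for_status()

# In this sketch the service is assumed to return (and also post to the
# context broker) NGSI-LD entities describing the detected faces.
for entity in response.json():
    print(entity["id"], entity["type"])
```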
More details about Projects can be found here.
The Toolbox provides a collection of machine learning models spanning various domains and performance levels. These models are used by the Toolbox Projects but can also be used out of the box as Python modules. They are interchangeable: models of the same domain implement the same methods and data formats, so they can be swapped depending on hardware or performance needs.
A complete list of the currently implemented models can be found here.
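Because models of the same domain share the same methods and data formats, switching between them is essentially a one-line change. The sketch below illustrates the idea with hypothetical module and class names; consult the model list above for the actual implementations.

```python
# Hypothetical module and class names, used only to illustrate the shared
# interface; see the model list for the real implementations.
from toolbox_models.face_detection import FastDetector, AccurateDetector  # hypothetical
import cv2

image = cv2.imread("people.jpg")

# Pick a lightweight model for constrained hardware or a heavier one for
# accuracy; both expose the same predict() method and return the same format.
detector = FastDetector()        # or: detector = AccurateDetector()
detections = detector.predict(image)

for detection in detections:
    print(detection)
```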
The Toolbox provides a set of standardized data models that define the structure and content of the data generated and consumed by the Toolbox components. Components to interact with the context broker and parse its data are also provided.
The data model specifications can be found here.
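For reading the generated data back, the sketch below queries the context broker directly over its NGSI-LD API and extracts a few property values. The entity type and attribute names match the illustrative examples above rather than a real Toolbox data model, and the broker is assumed to run locally.

```python
import requests

# Query an Orion-LD broker (assumed on localhost:1026) for all entities of
# the illustrative "Face" type and read a couple of property values.
response = requests.get(
    "http://localhost:1026/ngsi-ld/v1/entities",
    params={"type": "Face"},
    headers={"Accept": "application/ld+json"},
)
response.raise_for_status()

for entity in response.json():
    confidence = entity["confidence"]["value"]
    box = entity["boundingBox"]["value"]
    print(entity["id"], confidence, box)
```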
The overall Toolbox architecture is depicted in the following diagram:
Here, multiple Projects run in individual containers, each consisting mainly of a machine learning model, a base class and a REST API. The API communicates with users/apps to execute the Project and, at the same time, posts and retrieves data models from a context broker as the service's input and output data. Users/apps receive the output data either directly from the API or by querying the context broker. A special service called ImageStorage serves as temporary storage for the input and output images: the context broker is the component chosen to manage the data used by the Toolbox, but it is not intended to store large files. For this reason, ImageStorage is used by users/apps and other Toolbox Projects to store and retrieve images, although its use is not mandatory and images from other sources can be used as well.
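Putting the pieces together, a typical interaction might look like the sketch below: upload an image to ImageStorage, trigger a Project through its REST API, and read the resulting entities from the context broker. All hosts and routes here are placeholders chosen for illustration, not the documented endpoints of the deployed services.

```python
import requests

# Placeholder endpoints; the deployed services expose their own routes.
IMAGE_STORAGE = "http://localhost:9000"
PROJECT_API = "http://localhost:8080"
BROKER = "http://localhost:1026"

# 1. Upload the input image to the ImageStorage service (hypothetical route).
with open("people.jpg", "rb") as f:
    upload = requests.post(f"{IMAGE_STORAGE}/images", files={"file": f})
upload.raise_for_status()
image_id = upload.json()["id"]  # assumed response field

# 2. Trigger a Project (e.g. FaceDetection) on that image (hypothetical route).
run = requests.post(f"{PROJECT_API}/predict", json={"image_id": image_id})
run.raise_for_status()

# 3. Retrieve the generated entities from the context broker.
result = requests.get(f"{BROKER}/ngsi-ld/v1/entities", params={"type": "Face"})
result.raise_for_status()
print(result.json())
```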
The Toolbox can be installed manually as a Python package or it can be used with the provided Docker image and Docker compose:
- To install the Toolbox as a Python package, refer to the installation guide.
- To use the Toolbox with Docker, see the Docker guide.
Contributions are always welcome. Feel free to submit issues, feature requests, or pull requests.
This project is licensed under the MIT License.