asgutierrt/Introduction_to_IA

Multi-Layer Perceptron (MLP)

A learning exercise on training a multilayer perceptron (MLP), a type of artificial neural network. It also covers the development of criteria for making confident deductions about the training process and the model's characteristics. Training uses gradient descent, which iteratively minimizes the error of the modeled output. We also relate the results back to the learning problem that motivates the use of an MLP. The work is based on non-standard data provided as part of our course, giving us a chance to put theoretical knowledge into practice.
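The gradient-descent loop described above can be sketched as follows. This is a minimal NumPy illustration, not the course's actual notebook: the one-hidden-layer architecture, XOR data, learning rate, and iteration count are all illustrative choices.

```python
import numpy as np

# Illustrative setup: a one-hidden-layer MLP trained by gradient descent on XOR.
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(0, 1, (2, 8)); b1 = np.zeros(8)   # input -> hidden
W2 = rng.normal(0, 1, (8, 1)); b2 = np.zeros(1)   # hidden -> output

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

lr = 1.0
losses = []
for _ in range(2000):
    # forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    losses.append(float(np.mean((out - y) ** 2)))
    # backward pass: gradients of the mean squared error
    d_out = (out - y) * out * (1 - out) * (2 / len(X))
    d_h = (d_out @ W2.T) * h * (1 - h)
    # gradient descent step: move weights against the gradient
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(0)

print(f"loss: {losses[0]:.3f} -> {losses[-1]:.3f}")
```

Tracking the loss across iterations, as done here with `losses`, is the kind of evidence the exercise uses to reason about the training process.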

Hybrid NN Applications

This section explores the use of neural networks in classification, regression, and feature extraction. It implements autoencoders to manipulate data dimensionality and assesses their impact on regression tasks. The CNN LeNet-5 is applied to MNIST classification and then validated on a digits dataset created in the classroom. Generative Adversarial Networks (GANs) are used for generative and discriminative modeling, again validated on the classroom digits dataset. Finally, StyleGAN is explored for image generation and modification.

* Autoencoders and Convolutional Neural Networks (LeNet5)
* Generative adversarial networks and style transfer
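The dimensionality-manipulation idea behind the autoencoders can be sketched with a tiny linear model: an encoder compresses the data through a bottleneck and a decoder reconstructs it. The synthetic 8-dimensional data, layer sizes, and training settings here are assumptions for illustration; the course notebooks presumably use deeper networks and real image data.

```python
import numpy as np

rng = np.random.default_rng(1)
# synthetic data that actually lives on a 2-D subspace of R^8
Z = rng.normal(size=(200, 2))
X = Z @ rng.normal(size=(2, 8))

W_enc = rng.normal(0, 0.1, (8, 2))   # encoder: 8 -> 2 (the bottleneck)
W_dec = rng.normal(0, 0.1, (2, 8))   # decoder: 2 -> 8

lr = 0.01
errs = []
for _ in range(500):
    code = X @ W_enc          # compressed (low-dimensional) representation
    recon = code @ W_dec      # reconstruction back in the original space
    diff = recon - X
    errs.append(float(np.mean(diff ** 2)))
    # gradient descent on the reconstruction error
    g_dec = code.T @ diff / len(X)
    g_enc = X.T @ (diff @ W_dec.T) / len(X)
    W_dec -= lr * g_dec
    W_enc -= lr * g_enc

print(f"reconstruction MSE: {errs[0]:.3f} -> {errs[-1]:.3f}")
```

The `code` array is what would be fed to a downstream regression task when assessing the impact of the reduced dimensionality.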

Unsupervised Classification

This section applies a series of distance-, neighborhood-, and density-based algorithms to the classification of datasets. The clustering outcomes are assessed using a range of internal validation indices, both on the original data and on Autoencoder and UMAP transformations of it. Density-based algorithms are used to find the appropriate number of clusters, which then serves as the starting point for the neighborhood-based algorithms.
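The neighborhood-based step can be sketched as below, assuming k-means as the algorithm and the within-cluster sum of squares as a simple internal validation index; the actual notebooks likely rely on library implementations, richer indices, and a density-based method to choose `k` beforehand.

```python
import numpy as np

rng = np.random.default_rng(2)
# two well-separated synthetic blobs standing in for the course datasets
X = np.vstack([rng.normal(0, 0.5, (50, 2)), rng.normal(5, 0.5, (50, 2))])

def kmeans(X, k, iters=50):
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        # assign each point to its nearest center (the neighborhood step)
        d = np.linalg.norm(X[:, None] - centers[None], axis=2)
        labels = d.argmin(axis=1)
        # recompute each center as its cluster's mean (keep it if empty)
        centers = np.array([X[labels == j].mean(axis=0)
                            if np.any(labels == j) else centers[j]
                            for j in range(k)])
    return labels, centers

labels, centers = kmeans(X, k=2)
# internal validation: within-cluster sum of squares (lower = tighter clusters)
wcss = sum(((X[labels == j] - centers[j]) ** 2).sum() for j in range(2))
print(f"WCSS for k=2: {wcss:.2f}")
```

Here `k=2` is hard-coded; in the exercise that number would come from the density-based stage.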

About

LeNet5 training for IA exercise - Eafit
