Synapses

A plug-and-play library for neural networks written in Python!

# run in the directory of your project
pip install synapses-py==7.4.1

Neural Network

Create a neural network

Import Synapses, call NeuralNetwork.init and provide the size of each layer.

from synapses_py import NeuralNetwork, ActivationFunction, DataPreprocessor, Statistics
layers = [4, 6, 5, 3]
neuralNetwork = NeuralNetwork.init(layers)

neuralNetwork has 4 layers. The first layer has 4 input nodes and the last layer has 3 output nodes. There are 2 hidden layers with 6 and 5 neurons respectively.

Get a prediction

inputValues = [1.0, 0.5625, 0.511111, 0.47619]
prediction = \
        NeuralNetwork.prediction(neuralNetwork, inputValues)

prediction should be something like [ 0.8296, 0.6996, 0.4541 ]. The exact numbers depend on the randomly initialized weights, and with the default sigmoid activation each output value lies between 0.0 and 1.0.

Note that the lengths of inputValues and prediction are equal to the sizes of the input and output layers respectively.
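To make this invariant concrete, here is a minimal check using the layers, inputValues, and prediction defined above:

assert len(inputValues) == layers[0]    # 4 input nodes
assert len(prediction) == layers[-1]    # 3 output nodes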

Fit network

learningRate = 0.5
expectedOutput = [0.0, 1.0, 0.0]
fitNetwork = \
        NeuralNetwork.fit(
            neuralNetwork,
            learningRate,
            inputValues,
            expectedOutput
        )

fitNetwork is a new neural network trained on a single observation; the original neuralNetwork is left unchanged.

To train a neural network, you should fit it with multiple datapoints, as sketched below.
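A minimal training loop might look like the following sketch. The toy dataset and epoch count are hypothetical; the pattern is simply to fold NeuralNetwork.fit over the datapoints, carrying the returned network forward each time:

# hypothetical toy dataset of (inputValues, expectedOutput) pairs
dataset = [
    ([1.0, 0.5625, 0.511111, 0.47619], [0.0, 1.0, 0.0]),
    ([0.8, 0.4, 0.3, 0.9], [1.0, 0.0, 0.0]),
]

trainedNetwork = NeuralNetwork.init(layers)
for _epoch in range(100):
    for (inputs, expected) in dataset:
        # fit returns a new network, so thread it through the loop
        trainedNetwork = NeuralNetwork.fit(
            trainedNetwork, learningRate, inputs, expected)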

Create a customized neural network

The activation function of the neurons created with NeuralNetwork.init is sigmoid. If you want to customize the activation functions and the weight distribution, call NeuralNetwork.customizedInit.

def activationF(layerIndex):
    if layerIndex == 0:
        return ActivationFunction.sigmoid
    elif layerIndex == 1:
        return ActivationFunction.identity
    elif layerIndex == 2:
        return ActivationFunction.leakyReLU
    else:
        return ActivationFunction.tanh

from random import random

def weightInitF(_layerIndex):
    # draw each initial weight uniformly from (-1.0, 1.0)
    return 1.0 - 2.0 * random()

customizedNetwork = \
        NeuralNetwork.customizedInit(
            layers,
            activationF,
            weightInitF
        )
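The customized network is used exactly like the default one. For example, reusing the inputValues from above:

customizedPrediction = \
        NeuralNetwork.prediction(customizedNetwork, inputValues)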

Visualization

Call NeuralNetwork.toSvg to get a brief look at the network as an SVG drawing.

[Network drawing]

The color of each neuron depends on its activation function while the transparency of the synapses depends on their weight.

svg = NeuralNetwork.toSvg(customizedNetwork)
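Since toSvg returns a plain string, one way to view the drawing is to write it to a file and open it in a browser. The file name here is arbitrary:

with open("network.svg", "w") as f:
    f.write(svg)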

Save and load a neural network

JSON instances are compatible across platforms! We can generate, train, and save a neural network in Python and then load it and make predictions in JavaScript!

toJson

Call NeuralNetwork.toJson on a neural network to get a string representation of it. Use it as you like: save the JSON to the file system or insert it into a database table.

json = NeuralNetwork.toJson(customizedNetwork)

ofJson

loadedNetwork = NeuralNetwork.ofJson(json)

As the name suggests, NeuralNetwork.ofJson turns a JSON string into a neural network.
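A minimal round trip through the file system might look like this sketch (the file name is arbitrary). The loaded network should produce the same predictions as the original:

with open("network.json", "w") as f:
    f.write(NeuralNetwork.toJson(customizedNetwork))

with open("network.json") as f:
    restoredNetwork = NeuralNetwork.ofJson(f.read())

# the restored network behaves like the one that was saved
print(NeuralNetwork.prediction(restoredNetwork, inputValues))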

Encoding and decoding

One hot encoding is a process that turns discrete attributes into a list of 0.0 and 1.0 values. Minmax normalization scales continuous attributes into values between 0.0 and 1.0. You can use DataPreprocessor for datapoint encoding and decoding.

The first parameter of DataPreprocessor.init is a list of tuples (attributeName, discreteOrNot).

setosaDatapoint = {
    "petal_length": "1.5",
    "petal_width": "0.1",
    "sepal_length": "4.9",
    "sepal_width": "3.1",
    "species": "setosa"
}

versicolorDatapoint = {
    "petal_length": "3.8",
    "petal_width": "1.1",
    "sepal_length": "5.5",
    "sepal_width": "2.4",
    "species": "versicolor"
}

virginicaDatapoint = {
    "petal_length": "6.0",
    "petal_width": "2.2",
    "sepal_length": "5.0",
    "sepal_width": "1.5",
    "species": "virginica"
}

datasetList = [ setosaDatapoint,
                versicolorDatapoint,
                virginicaDatapoint ]

dataPreprocessor = \
        DataPreprocessor.init(
             [ ("petal_length", False),
               ("petal_width", False),
               ("sepal_length", False),
               ("sepal_width", False),
               ("species", True) ],
             iter(datasetList)
        )

encodedDatapoints = list(map(lambda x:
        DataPreprocessor.encodedDatapoint(dataPreprocessor, x),
        datasetList
))

encodedDatapoints is equal to:

[ [ 0.0     , 0.0     , 0.0     , 1.0     , 0.0, 0.0, 1.0 ],
  [ 0.511111, 0.476190, 1.0     , 0.562500, 0.0, 1.0, 0.0 ],
  [ 1.0     , 1.0     , 0.166667, 0.0     , 1.0, 0.0, 0.0 ] ]

Save and load the preprocessor by calling DataPreprocessor.toJson and DataPreprocessor.ofJson.
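The preprocessor can also decode. Assuming the library exposes DataPreprocessor.decodedDatapoint as the inverse of encodedDatapoint (the name is inferred from the naming convention and should be checked against the installed version), a round trip would look like:

encoded = \
        DataPreprocessor.encodedDatapoint(dataPreprocessor, setosaDatapoint)
# assumed inverse: should return the original attribute dictionary
decoded = \
        DataPreprocessor.decodedDatapoint(dataPreprocessor, encoded)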

Evaluation

To evaluate a neural network, you can call Statistics.rootMeanSquareError and provide the expected and predicted values.

expectedWithOutputValuesList = \
        [ ( [ 0.0, 0.0, 1.0], [ 0.0, 0.0, 1.0] ),
          ( [ 0.0, 0.0, 1.0], [ 0.0, 1.0, 1.0] ) ]

expectedWithOutputValuesIter = \
        iter(expectedWithOutputValuesList)

rmse = Statistics.rootMeanSquareError(
                        expectedWithOutputValuesIter
)
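For these two pairs, the first prediction matches the expectation exactly (zero squared error) while the second differs in a single component by 1.0. Assuming the squared errors are summed per pair and averaged over the pairs, rmse should come out to sqrt((0 + 1) / 2) ≈ 0.7071.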