Easy-to-use class-balanced cross-entropy and focal loss implementation for PyTorch
Updated Dec 17, 2024 - Python
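To illustrate what the focal loss above computes, here is a minimal plain-Python sketch of the binary case (this is the standard formula, not the repository's code; the function name and defaults are my own):

```python
import math

def focal_loss(p, alpha=0.25, gamma=2.0):
    """Binary focal loss for the probability p assigned to the true class.

    FL(p) = -alpha * (1 - p)**gamma * log(p)

    With gamma = 0 this reduces to alpha-weighted cross-entropy; larger
    gamma down-weights well-classified examples (p close to 1), which is
    what makes the loss useful under class imbalance.
    """
    return -alpha * (1.0 - p) ** gamma * math.log(p)

# A confident correct prediction contributes far less than an uncertain one.
easy = focal_loss(0.95)
hard = focal_loss(0.30)
```

A PyTorch version would express the same formula with tensor ops so it stays differentiable and batched.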
FraCTune is a MATLAB package for tuning fractional-order controllers with the cross-entropy method and an augmented Lagrangian formulation.
A Neural Network implementation from scratch built in Rust, designed to work with the MNIST dataset of handwritten digits.
A library for working with 1D piecewise linear probability density functions.
This repository contains my solutions and implementations for assignments from the Machine Learning course.
Towards Generalization in Subitizing with Neuro-Symbolic Loss using Holographic Reduced Representations
Numerically stable cross-entropy loss function implemented in Python and TensorFlow
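The usual trick behind numerically stable cross-entropy is to work from raw logits with log-sum-exp rather than taking the log of a softmax. A small plain-Python sketch of that idea (names are my own, not this repository's API):

```python
import math

def stable_cross_entropy(logits, target):
    """Cross-entropy from raw logits via the log-sum-exp trick.

    loss = log(sum_j exp(z_j)) - z_target, computed after subtracting
    max(z) so the exponentials cannot overflow for large logits.
    """
    m = max(logits)
    log_sum_exp = m + math.log(sum(math.exp(z - m) for z in logits))
    return log_sum_exp - logits[target]

# Finite even where the naive -log(softmax(z)) would overflow in exp():
loss = stable_cross_entropy([1000.0, 2000.0, 3000.0], target=2)
```

For small logits this matches the naive computation; for large ones only the stable form stays finite.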
Repo for the Deep Reinforcement Learning Nanodegree program
We apply the noisy cross-entropy method to the game of Tetris to demonstrate its efficiency.
Generic implementations of numerical optimization methods; currently only the cross-entropy method is included.
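The cross-entropy method that several of these repositories use can be sketched in a few lines for the one-dimensional case. This is a generic illustration in plain Python, not any listed repository's code (function and parameter names are assumptions):

```python
import random
import statistics

def cross_entropy_method(f, mu=0.0, sigma=5.0, pop=50, elite=10, iters=40, seed=0):
    """Minimize f over the reals with the cross-entropy method.

    Each iteration samples a population from N(mu, sigma), keeps the
    `elite` samples with the lowest f, and refits mu and sigma to them,
    so the sampling distribution concentrates around the optimum.
    """
    rng = random.Random(seed)
    for _ in range(iters):
        xs = sorted((rng.gauss(mu, sigma) for _ in range(pop)), key=f)
        best = xs[:elite]
        mu = statistics.fmean(best)
        sigma = statistics.stdev(best) + 1e-6  # small floor avoids premature collapse

    return mu

# Recover the minimum of a shifted parabola.
x_opt = cross_entropy_method(lambda x: (x - 3.0) ** 2)
```

The same loop generalizes to vectors (fit a mean and covariance) and to policy search, where the "sample" is a parameter vector and f is an episode's negative return.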
Probability and Statistics for Machine Learning
Using the cross-entropy method to solve Frozen Lake.
Emotion detection from small images using a CNN
Data classification using an MLP
Code for the paper "A unifying mutual information view of metric learning: cross-entropy vs. pairwise losses" (ECCV 2020 - Spotlight)
The IsoCrypto project was inspired by statistical tools used in mining astrophysical data: its main reference is the theoretical curves fitted to open clusters, known as isochrones. The project aims to extend the application of these curves to econophysics.
A mini project on MNIST dataset using TensorFlow.