Neural Network Distiller by Intel AI Lab: a Python package for neural network compression research. https://intellabs.github.io/distiller
Updated Apr 24, 2023 · Jupyter Notebook
A curated list of early-exiting resources
Code for paper "EdgeKE: An On-Demand Deep Learning IoT System for Cognitive Big Data on Industrial Edge Devices"
Official repository of Busolin et al., "Learning Early Exit Strategies for Additive Ranking Ensembles", ACM SIGIR 2021.
Code for "Apparate: Rethinking Early Exits to Tame Latency-Throughput Tensions in ML Serving" [SOSP '24]
Official PyTorch implementation of "LGViT: Dynamic Early Exiting for Accelerating Vision Transformer" (ACM MM 2023)
This project experiments with a modular architecture: building an early-exit model and testing it using TensorFlow.
C implementation of a SHA-1 cracker with various optimizations
The aim of this work is to build a model capable of classifying diseases in corn leaves. There are four classes: Common Rust, Gray Leaf Spot, Blight, and Healthy. Three different CNN-based models are employed, with early exit layers introduced in the last one.
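Several of the repositories above apply the same confidence-threshold pattern: attach a classification head ("exit") after an early layer and stop inference as soon as that head is sufficiently confident. A minimal sketch of the idea, with hypothetical stand-in functions in place of real CNN exit branches:

```python
import math

def softmax(logits):
    """Numerically stable softmax over a list of logits."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def early_exit_predict(x, stages, threshold=0.9):
    """Run exit heads in order; return (predicted class, exit index)
    as soon as one head's softmax confidence clears the threshold.
    The final head always answers, so every input gets a prediction."""
    for i, stage in enumerate(stages):
        probs = softmax(stage(x))
        confidence = max(probs)
        if confidence >= threshold or i == len(stages) - 1:
            return probs.index(confidence), i

# Hypothetical heads standing in for CNN exit branches (4 classes).
confident_head = lambda x: [8.0, 0.1, 0.1, 0.1]  # peaked logits: early exit fires
unsure_head = lambda x: [1.0, 0.9, 0.8, 0.7]     # flat logits: defer to next stage

pred, exit_idx = early_exit_predict(None, [confident_head, unsure_head])
# exits at the first head, since its confidence clears the 0.9 threshold
```

Real systems differ in how the exit decision is made (fixed threshold, learned gating, entropy-based), but the control flow above is the common core.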
This repository is dedicated to self-learning about early exit papers, including relevant code and documentation.