A custom TensorFlow implementation of Google's ELECTRA NLP model with compositional embeddings using complementary partitions
Updated May 1, 2021 - Jupyter Notebook
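The compositional-embedding idea in the repository above can be illustrated with the quotient-remainder trick, the canonical example of complementary partitions: each id is represented by combining rows from two small tables indexed by `id // p` and `id % p`. The sketch below is a generic NumPy illustration, not the repo's actual code; the table sizes and the element-wise-product combiner are assumptions.

```python
import numpy as np

# Illustrative sizes (assumptions, not the repo's configuration).
vocab_size = 1_000_000
num_buckets = 1000            # roughly sqrt(vocab_size)
dim = 8

rng = np.random.default_rng(0)
# Two complementary partitions: quotient and remainder w.r.t. num_buckets.
# Together they use ~2000 rows instead of 1_000_000.
quotient_table = rng.standard_normal((vocab_size // num_buckets + 1, dim))
remainder_table = rng.standard_normal((num_buckets, dim))

def compositional_embedding(ids: np.ndarray) -> np.ndarray:
    """Combine the two partition embeddings; element-wise product shown,
    summation is another common choice."""
    q = quotient_table[ids // num_buckets]
    r = remainder_table[ids % num_buckets]
    return q * r

ids = np.array([0, 999, 123456])
vecs = compositional_embedding(ids)   # shape (3, dim)
```

Because the two partitions are complementary, distinct ids map to distinct (quotient, remainder) pairs, so every id still gets a unique embedding despite the much smaller parameter count.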
Source code for the article "How to create a chatbot in Python": a chatbot using the Reformer, also known as the efficient Transformer, to generate dialogues between two bots.
Master's thesis with code investigating methods for incorporating long-context reasoning into low-resource languages without pre-training from scratch. We investigated whether multilingual models could inherit these properties by converting them into an efficient Transformer (such as the Longformer architecture).
PyTorch implementation of LISA (Linear-Time Self Attention with Codeword Histogram for Efficient Recommendation, WWW 2021)
Official PyTorch Implementation of Long-Short Transformer (NeurIPS 2021).
Official PyTorch implementation of our ECCV 2022 paper "Sliced Recursive Transformer"
Mask Transfiner for High-Quality Instance Segmentation, CVPR 2022
[NeurIPS 2022 Spotlight] This is the official PyTorch implementation of "EcoFormer: Energy-Saving Attention with Linear Complexity"
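Several of the repositories above (EcoFormer, Long-Short Transformer, LISA) build on the linear-attention family, which replaces `softmax(QK^T)V` with `phi(Q)(phi(K)^T V)` so cost grows as O(n) in sequence length instead of O(n^2). The NumPy sketch below uses the common `elu(x) + 1` feature map as a stand-in; EcoFormer's actual contribution is an energy-saving binarized kernel, which this sketch does not reproduce.

```python
import numpy as np

def feature_map(x):
    # elu(x) + 1: a positive feature map commonly used in kernelized attention.
    return np.where(x > 0, x + 1.0, np.exp(x))

def linear_attention(Q, K, V):
    """O(n) attention: phi(Q) @ (phi(K)^T @ V), normalized per query.

    The (d, d_v) summary kv is independent of sequence length n, which is
    where the linear complexity comes from.
    """
    Qf, Kf = feature_map(Q), feature_map(K)
    kv = Kf.T @ V                    # (d, d_v) summary of keys and values
    z = Qf @ Kf.sum(axis=0)          # per-query normalizer
    return (Qf @ kv) / z[:, None]

rng = np.random.default_rng(1)
n, d = 16, 4
Q, K, V = rng.standard_normal((3, n, d))
out = linear_attention(Q, K, V)      # shape (n, d)
```

Since the feature map is positive, each output row is a convex combination of value rows, mirroring softmax attention's behavior at a fraction of the cost.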
[CVPR 2023] IMP: iterative matching and pose estimation with transformer-based recurrent module
Demo code for CVPR2023 paper "Sparsifiner: Learning Sparse Instance-Dependent Attention for Efficient Vision Transformers"
Implementation of the Transformer variant proposed in "Transformer Quality in Linear Time"
[ICCV 2023] Efficient Video Action Detection with Token Dropout and Context Refinement
Gated Attention Unit (TensorFlow implementation)
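The Gated Attention Unit from "Transformer Quality in Linear Time" fuses the feed-forward and attention blocks: a gating branch U element-wise modulates a single-head attention readout of a value branch V, with squared-ReLU in place of softmax. The NumPy sketch below is simplified for illustration: it uses Q = K = Z rather than the paper's cheap per-dimension scale-and-offset transforms of Z, and ReLU gates rather than SiLU.

```python
import numpy as np

def relu2(x):
    # Squared ReLU, used instead of softmax in the GAU attention matrix.
    return np.maximum(x, 0.0) ** 2

def gated_attention_unit(X, Wu, Wv, Wz, Wo):
    """Minimal single-head GAU sketch (simplifying assumptions noted above)."""
    n = X.shape[0]
    U = np.maximum(X @ Wu, 0.0)       # gate branch (paper uses SiLU)
    V = np.maximum(X @ Wv, 0.0)       # value branch
    Z = X @ Wz                        # shared low-dim projection for Q and K
    A = relu2(Z @ Z.T / np.sqrt(Z.shape[1])) / n   # (n, n) attention weights
    return (U * (A @ V)) @ Wo         # gate * attention readout, then project

rng = np.random.default_rng(2)
X = rng.standard_normal((8, 6))       # 8 tokens, model dim 6 (toy sizes)
Wu, Wv = rng.standard_normal((2, 6, 16))
Wz = rng.standard_normal((6, 4))
Wo = rng.standard_normal((16, 6))
Y = gated_attention_unit(X, Wu, Wv, Wz, Wo)   # shape (8, 6)
```

Because the single head is so cheap, the paper stacks only GAU layers, and the same unit is the building block chunked into the linear-time FLASH variant.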
[MICCAI 2023] DAE-Former: Dual Attention-guided Efficient Transformer for Medical Image Segmentation
[NeurIPS'21] "Chasing Sparsity in Vision Transformers: An End-to-End Exploration" by Tianlong Chen, Yu Cheng, Zhe Gan, Lu Yuan, Lei Zhang, Zhangyang Wang
[ICLR 2022] "Unified Vision Transformer Compression" by Shixing Yu*, Tianlong Chen*, Jiayi Shen, Huan Yuan, Jianchao Tan, Sen Yang, Ji Liu, Zhangyang Wang
Nonparametric Modern Hopfield Models
This repository contains the official code for Energy Transformer, an efficient energy-based Transformer variant for graph classification
Official Implementation of Energy Transformer in PyTorch for Mask Image Reconstruction
[CVPR 2022 Oral] Restormer: Efficient Transformer for High-Resolution Image Restoration. SOTA for motion deblurring, image deraining, denoising (Gaussian/real data), and defocus deblurring.