Time2Vec in Keras and Torch

My implementation of "Time2Vec: Learning a Vector Representation of Time" as Keras and PyTorch layers. Compatible with TensorFlow 2.16.

How to Use It

Head straight to our Time2Vec Usage Template in Google Colab
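If you prefer to see the layer spelled out rather than imported, below is a minimal Keras sketch of a Time2Vec layer built directly from the paper's formula. The class name `Time2Vec` and the argument `k` here are illustrative, not necessarily this repository's API; the Colab template above shows the actual import and usage.

```python
import tensorflow as tf

class Time2Vec(tf.keras.layers.Layer):
    """Illustrative Time2Vec layer: one linear term plus k periodic (sine) terms."""

    def __init__(self, k, **kwargs):
        super().__init__(**kwargs)
        self.k = k

    def build(self, input_shape):
        # One (omega, phi) pair for the linear term, k pairs for the periodic terms.
        self.w0 = self.add_weight(name="w0", shape=(1,), initializer="random_uniform", trainable=True)
        self.p0 = self.add_weight(name="p0", shape=(1,), initializer="random_uniform", trainable=True)
        self.w = self.add_weight(name="w", shape=(1, self.k), initializer="random_uniform", trainable=True)
        self.p = self.add_weight(name="p", shape=(1, self.k), initializer="random_uniform", trainable=True)

    def call(self, tau):
        # tau: (batch, 1) scalar times -> (batch, k + 1) embeddings
        linear = self.w0 * tau + self.p0                     # i = 0
        periodic = tf.sin(tf.matmul(tau, self.w) + self.p)   # 1 <= i <= k
        return tf.concat([linear, periodic], axis=-1)

# Hypothetical usage: embed a scalar timestamp, then regress on the embedding.
inputs = tf.keras.Input(shape=(1,))
x = Time2Vec(k=15)(inputs)
outputs = tf.keras.layers.Dense(1)(x)
model = tf.keras.Model(inputs, outputs)
model.compile(optimizer="adam", loss="mse")
```

A PyTorch version is a direct translation: register the frequencies and phase shifts as `nn.Parameter`s and apply `torch.sin` in `forward`.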

The Concept

Time2Vec offers a versatile representation of time with three properties highlighted in the paper: it captures both periodic and non-periodic patterns, it is invariant to time rescaling, and it is simple enough to combine with many architectures. It encodes the scalar notion of time $\tau$ as $\mathbf{t2v}(\tau)$, a vector of size $k + 1$. The $i^{th}$ element of $\mathbf{t2v}$ is defined as follows:

$$\mathbf{t2v}(\tau)[i] = \begin{cases} \omega_i \tau + \phi_i, & \text{if } i = 0,\\ \mathcal{F}(\omega_i \tau + \phi_i), & \text{if } 1 \leq i \leq k. \end{cases}$$

Here $\mathcal{F}$ is a periodic activation function (the paper primarily uses $\sin$), and $\omega_i$ and $\phi_i$ are learnable parameters [1].
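To make the indexing in the equation concrete, here is a small numeric sketch in plain NumPy. The function name `t2v` and the random `omega`, `phi` values are purely illustrative; in the layers those parameters are learned.

```python
import numpy as np

def t2v(tau, omega, phi, F=np.sin):
    """t2v(tau): a linear term at i = 0, periodic terms for 1 <= i <= k."""
    linear = omega[0] * tau + phi[0]          # i = 0
    periodic = F(omega[1:] * tau + phi[1:])   # 1 <= i <= k
    return np.concatenate([[linear], periodic])

# Embed the scalar time tau = 3.0 into a vector of size k + 1 = 5.
rng = np.random.default_rng(0)
print(t2v(3.0, omega=rng.normal(size=5), phi=rng.normal(size=5)))
```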

Reference

1. Seyed Mehran Kazemi, Rishab Goel, Sepehr Eghbali, Janahan Ramanan, Jaspreet Sahota, Sanjay Thakur, Stella Wu, Cathal Smyth, Pascal Poupart, Marcus Brubaker. "Time2Vec: Learning a Vector Representation of Time." arXiv:1907.05321 [cs.LG], 11 Jul 2019. https://arxiv.org/abs/1907.05321