
My implementation of the transformer architecture from the Attention is All you need paper applied to time series.


dream857/Pytorch-Transfomer

 
 


This is an implementation of the Transformer architecture on time series data in PyTorch. In this case, modelling the sigmoid function is used as a toy problem.

Usage:
First come all the necessary imports, as well as matplotlib for visualisation.
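The notebook cells themselves are not reproduced in this README; a plausible import cell, assuming the standard PyTorch and matplotlib packages, might look like this:

```python
# Assumed imports for the notebook; the actual cell may differ.
import torch
import torch.nn as nn
import matplotlib.pyplot as plt
```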

Next, we need to define some hyperparameters, which will vary depending on the task.
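The names and values below are illustrative assumptions, not the notebook's actual settings; any small transformer configured along these lines would fit the toy problem:

```python
# Illustrative hyperparameters (assumed values; the notebook's may differ).
d_model = 32        # embedding dimension inside the transformer
nhead = 4           # number of attention heads (must divide d_model)
num_layers = 2      # number of encoder layers
seq_len = 20        # length of the window of past values fed to the model
lr = 1e-3           # learning rate for Adam
epochs = 500        # number of training iterations
```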

We initialise the network and an optimizer, in this case Adam, as well as an empty list to track losses for visualisation.
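As a sketch of this step, here is one way such a network could be set up: a transformer encoder that reads a window of past scalar values and predicts the next one. The architecture shown is an assumption, not the repository's exact model:

```python
import torch
import torch.nn as nn

# A minimal sketch of a time-series transformer (assumed architecture;
# the model in the notebook may be structured differently).
class TimeSeriesTransformer(nn.Module):
    def __init__(self, d_model=32, nhead=4, num_layers=2):
        super().__init__()
        self.embed = nn.Linear(1, d_model)           # project scalars to d_model
        layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers)
        self.head = nn.Linear(d_model, 1)            # predict the next scalar

    def forward(self, x):                            # x: (batch, seq_len, 1)
        h = self.encoder(self.embed(x))
        return self.head(h[:, -1])                   # read off the last position

model = TimeSeriesTransformer()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
losses = []                                          # loss history for plotting
```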

Using matplotlib in a Jupyter notebook, we can graph the losses in real time; first, let's initialise a figure.
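A figure for this purpose could be set up as follows; the axis labels are assumptions, and live updating in Jupyter additionally relies on an interactive backend (e.g. the `%matplotlib notebook` magic):

```python
import matplotlib.pyplot as plt

# Assumed figure setup for live loss plotting.
fig, ax = plt.subplots()
ax.set_xlabel("iteration")
ax.set_ylabel("loss")
line, = ax.plot([], [])   # an empty line, updated as losses arrive
```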

We can now begin training.
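A training loop for this step might look like the sketch below. Everything here is an assumption about the notebook's approach: the stand-in model and data sampling mirror the setup described above and are redefined so the sketch runs on its own, and random windows of the sigmoid curve serve as training examples.

```python
import torch
import torch.nn as nn

# Sketch of the training loop (assumed details; the notebook may differ).
torch.manual_seed(0)
seq_len = 20
enc = nn.TransformerEncoderLayer(d_model=16, nhead=4, batch_first=True)
model = nn.Sequential(
    nn.Linear(1, 16),                       # embed scalar values
    nn.TransformerEncoder(enc, num_layers=1),
    nn.Linear(16, 1),                       # per-position prediction
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = nn.MSELoss()
losses = []

for step in range(100):
    # Sample random windows of the sigmoid curve as training data.
    start = torch.rand(8, 1) * 10 - 5                  # window starts in [-5, 5)
    xs = start + 0.1 * torch.arange(seq_len + 1)
    ys = torch.sigmoid(xs).unsqueeze(-1)               # (batch, seq_len + 1, 1)
    inputs, target = ys[:, :-1], ys[:, -1]             # past values -> next value

    optimizer.zero_grad()
    pred = model(inputs)[:, -1]                        # prediction at last position
    loss = criterion(pred, target)
    loss.backward()
    optimizer.step()
    losses.append(loss.item())
    # In the notebook, the live plot would be redrawn here, e.g.:
    # line.set_data(range(len(losses)), losses); fig.canvas.draw()
```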

You should see a live plot, similar to this one, tracking the output error.

Now that the network is trained, let's give it the first few values of the sigmoid function and see how it approximates the rest.
We create another figure to visualise this.
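The rollout described here could be sketched as follows: seed a window with the first true sigmoid values, then feed the model its own predictions autoregressively. The untrained stand-in model is defined only so the sketch is self-contained; in the notebook, the trained network from the previous step would be used instead.

```python
import torch
import torch.nn as nn

# Stand-in for the trained model: maps a (1, seq_len, 1) window of past
# values to per-position outputs (assumed interface, untrained here).
model = nn.Sequential(nn.Linear(1, 8), nn.ReLU(), nn.Linear(8, 1))

seq_len = 20
x = torch.linspace(-5, 5, 100)
true = torch.sigmoid(x)
window = true[:seq_len].clone()                 # seed with the first true values

preds = []
with torch.no_grad():
    for _ in range(len(x) - seq_len):
        inp = window[-seq_len:].reshape(1, seq_len, 1)
        nxt = model(inp)[0, -1, 0]              # next-value prediction
        window = torch.cat([window, nxt.unsqueeze(0)])
        preds.append(nxt.item())

# In the notebook, a new figure would plot `true` against the
# seed values followed by `preds`.
```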

If all went well, the output should look something like this :

Note that the network uses past values instead of the x axis for its predictions, so it makes sense that the output is offset. However, it did successfully capture the shape.

Resources:
