
Transformer Architectures #34

Open
ghost opened this issue Feb 23, 2020 · 2 comments

ghost commented Feb 23, 2020

Title

Introduction to the Transformer architecture and its descendants such as BERT, GPT-2, and XLNet

Description

I will introduce the Transformer architecture in full, starting from its core building blocks (the encoder, the decoder, and the self-attention layer). If time permits, I would then like to cover its descendants such as BERT, GPT-2, and XLNet.
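
For readers unfamiliar with the self-attention layer mentioned above, here is a minimal sketch of single-head scaled dot-product attention in NumPy. The tensor sizes and weight matrices are arbitrary and purely illustrative, not taken from the talk material.

```python
# Minimal sketch of single-head scaled dot-product self-attention.
# Dimensions and weights are made up for illustration only.
import numpy as np

def self_attention(x, w_q, w_k, w_v):
    """Self-attention over a sequence x of shape (seq_len, d_model)."""
    q = x @ w_q                       # queries, shape (seq_len, d_k)
    k = x @ w_k                       # keys,    shape (seq_len, d_k)
    v = x @ w_v                       # values,  shape (seq_len, d_v)
    d_k = q.shape[-1]
    scores = q @ k.T / np.sqrt(d_k)   # pairwise attention scores
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over keys
    return weights @ v                # weighted sum of value vectors

# Example with arbitrary sizes: 5 tokens, model dim 16, head dim 8.
rng = np.random.default_rng(0)
x = rng.normal(size=(5, 16))
w_q, w_k, w_v = (rng.normal(size=(16, 8)) for _ in range(3))
print(self_attention(x, w_q, w_k, w_v).shape)  # (5, 8)
```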

Duration

  • 45 min

Audience

A basic understanding of neural networks

Outline

A detailed outline of my talk is described in my Medium articles:
https://medium.com/@tejanm

Additional notes

Here is a link to the Medium articles I will be referring to:
https://medium.com/@tejanm


  • [Yes] Do you require internet for the presentation?
  • [Ok] Do you want your talk to be recorded?
@TrigonaMinima

@tejanmehndiratta are you available to speak on the 29th?


ghost commented Feb 24, 2020 via email
