Actions: Dobiasd/frugally-deep

Showing runs from all workflows
185 workflow runs

Revert debug changes in application_performance.cpp
ci #563: Commit 2479061 pushed by Dobiasd
January 1, 2024 09:25 · 9m 23s · master

Remove redundant internal function subtract_tensor
ci #562: Commit 646c71d pushed by Dobiasd
January 1, 2024 09:15 · 9m 23s · master

Merge branch 'master' of https://github.com/Dobiasd/frugally-deep
ci #561: Commit f82b6fd pushed by Dobiasd
January 1, 2024 09:14 · 9m 19s · master

Update TensorFlow to version 2.15.0
ci #560: Commit 5b44839 pushed by Dobiasd
January 1, 2024 09:09 · 8m 48s · master

Update TensorFlow to version 2.15.0
ci #559: Pull request #410 opened by Dobiasd
January 1, 2024 08:44 · 8m 50s · tensorflow-2-15-0

Update TensorFlow to version 2.15.0
ci #558: Commit c0c060e pushed by Dobiasd
January 1, 2024 08:44 · 10m 4s · tensorflow-2-15-0

Bump version to 0.15.30
ci #557: Commit 6e673fb pushed by Dobiasd
December 31, 2023 18:11 · 9m 4s · v0.15.30

Bump version to 0.15.30
ci #556: Commit 6e673fb pushed by Dobiasd
December 31, 2023 18:10 · 9m 30s · master

Add MultiHeadAttention layer (#392)
ci #555: Commit 7104dd0 pushed by Dobiasd
December 31, 2023 18:09 · 9m 43s · master

Add MultiHeadAttention layer
ci #554: Pull request #392 synchronize by Dobiasd
December 31, 2023 17:58 · 9m 19s · multi-head-attention

Add MultiHeadAttention layer
ci #552: Pull request #392 synchronize by Dobiasd
December 31, 2023 17:54 · 9m 19s · multi-head-attention

Revert debug output
ci #551: Commit b35deb9 pushed by Dobiasd
December 31, 2023 17:54 · 7m 31s · multi-head-attention

Add MultiHeadAttention layer
ci #550: Pull request #392 synchronize by Dobiasd
December 31, 2023 17:52 · 9m 37s · multi-head-attention

double-check weights shapes
ci #549: Commit 41ac53a pushed by Dobiasd
December 31, 2023 17:52 · 8m 57s · multi-head-attention

Add MultiHeadAttention layer
ci #548: Pull request #392 synchronize by Dobiasd
December 31, 2023 17:43 · 9m 19s · multi-head-attention

Do not pass unused attention_axes
ci #547: Commit a95abf4 pushed by Dobiasd
December 31, 2023 17:43 · 9m 15s · multi-head-attention

Add MultiHeadAttention layer
ci #546: Pull request #392 synchronize by Dobiasd
December 31, 2023 12:58 · 9m 23s · multi-head-attention

Check for attention_axes=None in conversion
ci #545: Commit fd6e7c4 pushed by Dobiasd
December 31, 2023 12:58 · 9m 10s · multi-head-attention

Add MultiHeadAttention layer
ci #544: Pull request #392 synchronize by Dobiasd
December 31, 2023 12:55 · 9m 6s · multi-head-attention

remove todo comment
ci #543: Commit 0d2be86 pushed by Dobiasd
December 31, 2023 12:55 · 8m 59s · multi-head-attention

Add MultiHeadAttention layer
ci #542: Pull request #392 synchronize by Dobiasd
December 31, 2023 12:50 · 9m 56s · multi-head-attention

remove debug tests
ci #541: Commit 0c6cc0a pushed by Dobiasd
December 31, 2023 12:50 · 8m 59s · multi-head-attention

Add MultiHeadAttention layer
ci #540: Pull request #392 synchronize by Dobiasd
December 31, 2023 12:50 · 9m 1s · multi-head-attention