Use fused softmax kernel in llama attention layer #3584

CI check: Test candle-book succeeded Oct 23, 2024 in 1m 39s