☆ Yσɠƚԋσʂ ☆@lemmy.ml to Technology@lemmy.ml · English · 5 months ago
Researchers upend AI status quo by eliminating matrix multiplication in LLMs (arstechnica.com)
cross-posted to: technology@lemmygrad.ml
theshatterstone54@feddit.uk · 5 months ago
Why are people downvoting? This is huge and should make LLMs more power-efficient and memory-efficient.
☆ Yσɠƚԋσʂ ☆@lemmy.ml (OP) · 5 months ago
Indeed, this seems like a big step forward. Here's a link to the model: https://github.com/ridgerchu/matmulfreellm
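For anyone wondering where the efficiency gain comes from: as I understand the linked work, the dense layers use ternary weights in {-1, 0, +1}, so the usual floating-point multiply-accumulate collapses into additions and subtractions. Here's a rough NumPy sketch of that idea (illustrative only, not code from the repo):

```python
import numpy as np

rng = np.random.default_rng(0)

x = rng.standard_normal(8).astype(np.float32)             # input activations
w = rng.integers(-1, 2, size=(8, 4)).astype(np.float32)   # ternary weights in {-1, 0, +1}

# Standard matmul for reference.
y_matmul = x @ w

# Equivalent accumulation without multiplications: add x[i] where the
# weight is +1, subtract it where the weight is -1, skip zeros.
y_accum = np.zeros(4, dtype=np.float32)
for i in range(8):
    for j in range(4):
        if w[i, j] == 1:
            y_accum[j] += x[i]
        elif w[i, j] == -1:
            y_accum[j] -= x[i]

assert np.allclose(y_matmul, y_accum)
print(y_accum)
```

Fewer multiplications and 2-bit-ish weights are why this should cut both power draw and memory footprint, especially on hardware that can exploit it.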