Zetaphor@zemmy.cc to LocalLLaMA@sh.itjust.works · English · 1 year ago

Distilling step-by-step: Outperforming larger language models with less training data and smaller model sizes (blog.research.google)
noneabove1182@sh.itjust.works · 1 year ago

Somehow this is even more confusing because that code hasn't been touched in 3 months, maybe it just took them that long to validate? Will have to read through it, thanks!