Sharing Less Data Could Make AI Models Smarter and Greener
Ben Sullivan
One tenth of one percent. That is all it takes. Out of the billions of parameters inside a large language model, only a tiny, specially chosen sliver, roughly 0.1%, carries the information that actually matters when the model needs to learn something new. The rest is, in a sense, dead weight during the update. Recognising this has led a team of researchers at Stevens Institute of Technology to an algorithm they call MEERKAT, which may, more quietly than most AI breakthroughs, change how...
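To make the idea concrete, here is a minimal sketch of a generic sparse update, where only the fraction of parameters with the largest gradient magnitudes is changed and the rest are left untouched. This is an illustrative assumption, not the MEERKAT algorithm itself, whose actual selection rule the article has not described; the function name, the 0.1% threshold, and the gradient-magnitude criterion are all hypothetical.

```python
import numpy as np

def sparse_update(params, grads, lr=0.01, keep_frac=0.001):
    """Apply a gradient step to only the keep_frac fraction of
    parameters with the largest gradient magnitudes.

    Hypothetical sketch of sparse updating in general; not the
    MEERKAT selection rule, which the article does not specify.
    """
    flat = np.abs(grads).ravel()
    k = max(1, int(keep_frac * flat.size))
    # Threshold at the k-th largest gradient magnitude.
    thresh = np.partition(flat, -k)[-k]
    mask = np.abs(grads) >= thresh
    # Only masked entries move; everything else is "dead weight".
    return params - lr * grads * mask

rng = np.random.default_rng(0)
params = rng.normal(size=(1000, 1000))   # one million parameters
grads = rng.normal(size=(1000, 1000))
updated = sparse_update(params, grads)
changed = np.count_nonzero(updated != params)
print(changed)  # about 1,000 entries, i.e. 0.1% of one million
```

With a million parameters and `keep_frac=0.001`, roughly a thousand entries change per step, which hints at why touching so few weights could cut both compute and energy costs.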
