
How much information do LLMs really memorize? Now we know, thanks to Meta, Google, Nvidia and Cornell

Date: 2025-06-05 17:35:34

By training GPT-style models on uniformly random data, which can only be memorized rather than generalized, the researchers isolated memorization from learning and found a fixed capacity of approximately 3.6 bits per parameter.
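To put that figure in perspective, a short back-of-the-envelope calculation converts the per-parameter capacity into total storage for a model of a given size. The 3.6 bits/parameter constant comes from the article; the model sizes below are hypothetical examples, not figures from the study.

```python
BITS_PER_PARAM = 3.6  # capacity estimate reported by the researchers

def capacity_bytes(n_params: float) -> float:
    """Estimated total memorization capacity, in bytes, for a model
    with n_params parameters (8 bits per byte)."""
    return n_params * BITS_PER_PARAM / 8

# Hypothetical example: a 1-billion-parameter model
print(f"{capacity_bytes(1e9) / 1e6:.0f} MB")  # 450 MB
```

Under this estimate, even multi-billion-parameter models can store only a few gigabytes of raw memorized content, far less than the size of their training corpora.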


Source: venturebeat.com