Overview
The Hacker News community picked up this story, which gathered 8 points and 3 comments within a few hours, making it one of today's notable AI stories. Original source: github.com.
In this article we review the highlights of the story, analyze it from an Arab perspective, and look at what it means for Arab users interested in AI tools.
Details
Most RAG setups fail because they treat memory like a static filing cabinet. When every transient bug fix or abandoned rule is stored forever, the context window eventually chokes on noise, spiking token costs and degrading the agent's reasoning.

This implementation experiments with a biological approach by using the Ebbinghaus forgetting curve to manage context as a living substrate. Memories are assigned a "strength" score: each recall reinforces the data and flattens its decay curve (spaced repetition), while unused data eventually falls below a threshold and is pruned.

To solve the "logical neighbor" problem, where semantic search misses relevant but non-similar nodes, a graph layer sits on top of the vector store. Benchmarked against the LoCoMo dataset, this reached 52% Recall@5, nearly double the accuracy of stateless vector stores, while cutting token waste by roughly 84%.

Built as a local-first MCP server using DuckDB, the hypothesis is that for agents handling long-running projects, "what to forget" is just as critical as "what to remember." The author is interested to hear whether others are exploring non-linear decay or similar biological constraints for context management.

GitHub: https://github.com/sachitrafa/cognitive-ai-memory
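To make the decay mechanism concrete, here is a minimal sketch in Python of an Ebbinghaus-style forgetting curve with spaced-repetition reinforcement. The class name, the strength/boost parameters, and the exact exponential formula are illustrative assumptions, not the repo's actual API.

```python
import math
import time

# Sketch of Ebbinghaus-style memory decay with spaced-repetition reinforcement.
# All names and constants here are hypothetical, not taken from the repo.

class MemoryItem:
    def __init__(self, content: str, base_strength: float = 1.0):
        self.content = content
        self.strength = base_strength      # grows with each recall
        self.last_recall = time.time()     # timestamp of last reinforcement

    def retention(self, now: float | None = None) -> float:
        """Exponential forgetting curve: R = exp(-t / S)."""
        now = time.time() if now is None else now
        elapsed_hours = (now - self.last_recall) / 3600.0
        return math.exp(-elapsed_hours / self.strength)

    def recall(self, boost: float = 1.5) -> None:
        """Each recall multiplies strength, flattening the decay curve."""
        self.strength *= boost
        self.last_recall = time.time()

def prune(memories: list[MemoryItem], threshold: float = 0.05) -> list[MemoryItem]:
    """Keep only items whose retention is still above the pruning threshold."""
    return [m for m in memories if m.retention() >= threshold]
```

The key property is that frequently recalled memories decay more slowly (larger strength S flattens the curve), while untouched ones drop below the threshold and get pruned, which is what keeps the context window from filling with stale detail.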
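The "logical neighbor" idea can likewise be sketched as a thin graph layer over any vector store: retrieve by similarity first, then expand one hop along explicit relations. The store and graph interfaces below are hypothetical stand-ins, not the project's actual schema.

```python
from collections import defaultdict

# Sketch of graph-augmented retrieval on top of a vector store.
# The .search(query, k) interface and edge semantics are assumptions.

class GraphAugmentedStore:
    def __init__(self, vector_store):
        self.vector_store = vector_store   # anything exposing .search(query, k) -> list of ids
        self.edges = defaultdict(set)      # memory_id -> related memory_ids

    def link(self, a: str, b: str) -> None:
        """Record an explicit relation (e.g. 'fixes', 'supersedes') between two memories."""
        self.edges[a].add(b)
        self.edges[b].add(a)

    def recall(self, query: str, k: int = 5) -> list[str]:
        """Semantic search first, then expand one hop along graph edges."""
        hits = self.vector_store.search(query, k)
        expanded = list(hits)
        for memory_id in hits:
            for neighbor in self.edges[memory_id]:
                if neighbor not in expanded:
                    expanded.append(neighbor)
        return expanded[: k * 2]           # cap the widened result set
```

This is how graph expansion can surface memories that are logically related to a query (for example, the rule a bug fix superseded) even when their embeddings are not close to it.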
Original source
This story comes from Hacker News, one of the most widely followed tech communities in the world.