But interestingly every now and then I look at the compaction result and it now says if you need to reference the previous conversation you can open <file>. So technically that context is connected.
I’ve noticed MCPs get unstable after compaction, though even that’s been less of an issue lately.
Earlier (29 points, 4 comments) https://news.ycombinator.com/item?id=47367129
I mentioned this at work, but context still rots at the same rate. 90k tokens consumed gives just as bad results in a 100k context window as in a 1M one.
Personally, I’m on a 6M+ line codebase and had no problems with the old window.
CC seems to have gotten pretty good with auto compacting and continuing where it left off. Are there any good use cases for this?
I guess one would be avoiding tool use?