[BUG] Incorrect context length while using with LM Studio #11924
Description
Problem (one or two sentences)
There is a large mismatch between the context length shown by the Roo Code extension and the actual context length reported by LM Studio.
As shown by Roo Code: (screenshot)
Logs in LM Studio: (screenshot)
Context (who is affected and when)
Happens when using Roo Code with LM Studio and Qwen 3.5 35B-A3B.
Reproduction steps
Use Roo Code as intended over the course of a task; the difference between the extension's estimated token count and the actual token count reported by LM Studio gradually grows.
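The gradual drift described above is characteristic of a client that estimates token usage with a heuristic (e.g. a fixed characters-per-token ratio) instead of the server's real tokenizer: a small per-message error compounds as the conversation history grows. The sketch below is purely illustrative and assumes nothing about Roo Code's actual implementation; both `estimated_tokens` and `actual_tokens` are hypothetical stand-ins.

```python
# Hypothetical illustration: a fixed chars-per-token estimate drifting
# away from a "true" tokenizer count as conversation history accumulates.
# Neither function reflects Roo Code's or LM Studio's real tokenization.

def estimated_tokens(text: str) -> int:
    # Common rough heuristic: ~4 characters per token.
    return len(text) // 4

def actual_tokens(text: str) -> int:
    # Stand-in for a real tokenizer; here, one token per whitespace-
    # separated word. Purely illustrative.
    return len(text.split())

history = ""
for turn in range(1, 6):
    # Each turn appends progressively more text, as in a real task.
    history += "user: please refactor the helper module again " * turn
    est = estimated_tokens(history)
    act = actual_tokens(history)
    print(f"turn {turn}: estimated={est} actual={act} drift={est - act}")
```

The drift column grows monotonically: the per-character error never cancels out, so the longer the task runs, the larger the mismatch, which matches the behaviour reported here.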
Expected result
The token count calculated by the extension should match the actual token count reported by LM Studio.
Actual result
Roo Code and LM Studio report different token counts, and the gap widens as the task progresses.
Variations tried (optional)
No response
App Version
Version: 3.51.1 (7c9722b)
API Provider (optional)
LM Studio
Model Used (optional)
Qwen 3.5 35B-A3B
Roo Code Task Links (optional)
No response