Changelog


Faster agent mode and autocomplete

⚡️ Faster agent mode

By avoiding full-file rewrites and instead applying code deterministically with AST-based methods, agent mode can quickly make targeted edits to files that are hundreds of lines long. See an example here.

💨 mercury-coder-small

Released by Inception Labs, mercury-coder-small is a diffusion-based language model trained for near-instant autocomplete suggestions. It is now available and ready to use in Continue.
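As a rough illustration, wiring the model into autocomplete via config.yaml might look like the sketch below. The provider id, API key placeholder, and model identifier are assumptions for illustration, not the exact values Continue requires.

```yaml
# ~/.continue/config.yaml (minimal sketch; field values are illustrative)
name: My Assistant
version: 0.0.1
schema: v1

models:
  - name: Mercury Coder Small
    provider: inceptionlabs          # assumed provider id for Inception Labs
    model: mercury-coder-small
    apiKey: <YOUR_INCEPTION_API_KEY>
    roles:
      - autocomplete                 # use this model for inline suggestions
```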

This release also includes many UI improvements to Edit mode and tool outputs.


Automatic rule creation and fast apply

📝 Automatic rule creation

Continue can now write rules for you! Just ask while in agent mode and it will automatically generate a rule file in ~/.continue/rules.
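For a sense of what such a file looks like, here is a hypothetical example at ~/.continue/rules/prefer-pytest.md. Both the frontmatter fields and the rule text are illustrative sketches, not output copied from Continue.

```md
---
name: Prefer pytest
globs: "**/*.py"
---

Write all new tests with pytest. Use plain assert statements and fixtures
instead of unittest-style test classes.
```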

⚡️ Fast apply

Continue now supports fast apply via various models on the hub, including Relace Instant Apply and Morph v0.
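As a sketch, assigning one of these models the apply role in config.yaml could look like the following; the provider id and model name are placeholders, and the exact identifiers for the hub blocks may differ.

```yaml
models:
  - name: Relace Instant Apply
    provider: relace                 # assumed provider id
    model: instant-apply             # assumed model name
    apiKey: <YOUR_RELACE_API_KEY>
    roles:
      - apply                        # used only for applying edits, not for chat
```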


The "notch", LLM logs, and improved tools

⚙️ The "notch"

The "notch" is an easy-access control panel for your custom assistant, right above the chat input box. Toggle rules, bookmark prompts, allow tools, and manage your blocks without leaving the current session.

🔍 LLM logs

Thanks to a great community contribution, Continue now ships with a purpose-built UI for viewing the exact prompt and completion from your LLM, along with other important request metadata. Turn it on in VS Code settings -> Continue: Enable Console.

🛠️ Improved tools

With a faster edit tool, ripgrep for search, and a new glob-based file search, agent mode can navigate and modify your codebase much more efficiently.


Agent mode, tabs, and local assistants

✨ Introducing agent mode

What used to be called "tool use" was promoted to the more prominent "agent mode" and given a lot more polish. When in agent mode, Continue can autonomously read and write files, run terminal commands, search the codebase or internet, and more.

🔖 Tabs

For those who enjoy keeping many browser tabs open, this is for you. When tabs are enabled, starting a new chat creates a new tab, making it easier to return to previous conversations.

💻 Local assistants

To speed up the feedback loop when building custom assistants, you can now define them locally in your workspace or the global ~/.continue folder. Read more in the docs.
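For example, a local assistant could live at .continue/assistants/backend.yaml in your workspace (or under the global ~/.continue folder). The file below is a minimal sketch with placeholder values, not a canonical template; the model id and rule text are purely illustrative.

```yaml
# .continue/assistants/backend.yaml (minimal sketch; values are placeholders)
name: Backend Assistant
version: 0.0.1
schema: v1

models:
  - name: Claude Sonnet
    provider: anthropic
    model: claude-3-7-sonnet-latest   # model id shown for illustration
    apiKey: <YOUR_ANTHROPIC_API_KEY>
    roles:
      - chat
      - edit

rules:
  - Prefer small, focused functions and add type hints to new Python code.
```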


Continue 1.0

Continue 1.0 introduces the ability to create multiple custom assistants using the ecosystem of models, rules, MCP servers, and more from hub.continue.dev. Read more about the launch here.