
Mouse vs. Built-In File Editing

Head-to-head comparison across 8 real-world editing scenarios

How They Compare

Built-in file-editing tools rely on string replacement. Mouse uses precision operations that target exactly what needs to change. For the underlying problem this solves, see Why Mouse Exists.

Below are 8 real-world editing scenarios that developers encounter regularly. For each, we show what happens with built-in tools and what happens with Mouse.


Scenario 1: Checklists

The task: Mark tasks 1.2, 1.4, and 1.5 as complete in a 20-line Markdown checklist: toggle - [ ] to - [x] without modifying any task descriptions.

Built-in tools: The agent rewrites the entire checklist (or every line through task 1.5) via string replacement just to toggle three checkboxes. The developer reviews 5+ rewritten lines instead of 3 checkboxes, and frequently discovers formatting errors or content drift in lines the agent wasn't supposed to touch.

Mouse: The agent uses a columnar edit to target the 3 specific lines, replacing the space inside each checkbox with x. Only the 3 checkbox characters change. Surrounding lines are never read, echoed, or touched.
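The columnar edit described above can be sketched in plain Python. The helper below is a hypothetical illustration of the concept, not Mouse's actual API: it replaces the character at a fixed column on selected lines and leaves everything else untouched.

```python
def column_replace(lines, line_numbers, column, char):
    """Replace the character at 0-based `column` on each 1-based line in
    `line_numbers`, leaving every other character untouched."""
    out = list(lines)
    for n in line_numbers:
        line = out[n - 1]
        out[n - 1] = line[:column] + char + line[column + 1:]
    return out

checklist = [
    "- [ ] 1.1 Draft the spec",
    "- [ ] 1.2 Review the spec",
    "- [ ] 1.3 Publish the spec",
    "- [ ] 1.4 Notify the team",
    "- [ ] 1.5 Archive old docs",
]
# Toggle tasks 1.2, 1.4, and 1.5: column 3 holds the checkbox character.
done = column_replace(checklist, [2, 4, 5], 3, "x")
```

The edit is described entirely by line numbers, a column, and one character; the task descriptions are never part of the operation.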


Scenario 2: Filling Out Forms and Outlines

The task: A developer creates a plan document as an outline, then asks the agent to populate individual sections out of order: body sections first, Introduction and Conclusion last.

Built-in tools: Inserting content into a specific section requires replacing the header and surrounding content to anchor the string replacement. Headers get rewritten, blank lines shift, and populating section 4 before section 2 risks corrupting earlier placeholders. The agent may give up and rewrite the entire document.

Mouse: The agent inserts content after a specific line number, placing new text precisely between existing headers without touching the headers themselves. Sections can be populated in any order. When using batch editing, all insertions reference the original file state, so there is no line drift between operations. Critically, the agent never needs to touch the structure of the file to fill out a form or insert a section in a document outline, ensuring that the file's structural integrity remains intact through multiple rounds of edits.
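Line-anchored insertion with batch semantics can be sketched as follows. This is a hypothetical helper, not Mouse's actual API: every anchor refers to the original file's line numbers, and applying the insertions bottom-up keeps those numbers valid, so sections can be populated in any order without drift.

```python
def batch_insert(lines, insertions):
    """insertions: list of (after_line, new_lines) pairs, 1-based, all
    relative to the ORIGINAL `lines`. Applied bottom-up so earlier
    anchors are never shifted by later insertions."""
    out = list(lines)
    for after, new_lines in sorted(insertions, key=lambda i: i[0], reverse=True):
        out[after:after] = new_lines
    return out

outline = [
    "# Plan",
    "## Introduction",
    "## Body",
    "## Conclusion",
]
# Populate sections out of order: Body first, then Introduction.
filled = batch_insert(outline, [
    (3, ["Body text goes here."]),
    (2, ["Intro text goes here."]),
])
```

The headers themselves are never rewritten; they only gain neighbors.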


Scenario 3: Table Column Operations

The task: Insert a new value in the Nth column of an ASCII table, or move a column of data from one position to another in a CSV file.

Built-in tools: Built-in tools have no concept of columns. The agent rewrites the entire table (or file) to insert or move columnar data. For even moderately sized files, this hits context window limits, wastes a massive number of tokens, and introduces formatting errors in rows the agent wasn't supposed to modify. For column moves in larger files, the operation is typically impossible.

Mouse: Columnar edits target specific column positions across a range of lines, inserting, replacing, or deleting at precise offsets without touching content outside that range. For column relocation, Mouse can move an entire rectangular region of content with zero calculation required by the agent.
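The column-relocation idea can be illustrated with a small sketch. The helper is hypothetical (not Mouse's actual API) and uses naive comma splitting for brevity, which a real CSV implementation would not do: it cuts one column and reinserts it at a new position on every row, leaving the other cells untouched.

```python
def move_csv_column(lines, src, dst):
    """Move 0-based column `src` to position `dst` on every row.
    Naive comma splitting; illustrative only."""
    out = []
    for line in lines:
        cells = line.split(",")
        cells.insert(dst, cells.pop(src))
        out.append(",".join(cells))
    return out

table = [
    "id,name,email",
    "1,Ada,ada@example.com",
    "2,Lin,lin@example.com",
]
# Move the email column to the front.
moved = move_csv_column(table, 2, 0)
```

The operation is a rectangular cut-and-paste: the agent names two column positions, not the data being moved.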


Scenario 4: Commenting Out Legacy Code

The task: Comment out 40 lines of legacy code by prefixing each line with //, without modifying the code content or the surrounding lines.

Built-in tools: The agent reads all 40 lines, copies them in full, then replaces them with the same 40 lines prefixed, doubling the token cost. Any whitespace mismatch causes the string replacement to fail silently or match the wrong location. The developer must verify every line was commented correctly and that no content was altered.

Mouse: A columnar edit inserts // at column 0 of each line in the range. The agent never reads or echoes the line content. Token cost is constant regardless of how many lines are being commented.


Scenario 5: Cross-File Import Updates

The task: Rename an exported function across multiple files: update the export in the source file and every import that references it across 6 other files.

Built-in tools: The agent makes 7+ separate string replacement calls sequentially, one per file. Each requires reading the file, finding the exact import line, echoing it, and replacing it. If any single replacement fails, the codebase is left in a partially updated state. Some files have the new name while others still have the old one.

Mouse: Batch editing stages all 7 replacements in a single call, each targeting its own file. All changes are staged together for review. Each file's changes are atomic: if any operation within a file fails, none of that file's edits are applied. The developer reviews the staged result across all files before saving.
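Per-file atomicity in a batch can be sketched like this. The helper and the file contents are hypothetical, not Mouse's actual API: every edit is checked before anything is written, and a file whose edits cannot all apply is skipped entirely rather than half-updated.

```python
def apply_batch(files, edits):
    """files: {path: text}. edits: {path: [(old, new), ...]}.
    Per-file atomicity: if any `old` string is missing from a file,
    none of that file's edits are applied."""
    result = dict(files)
    for path, pairs in edits.items():
        text = files[path]
        if any(old not in text for old, _ in pairs):
            continue  # leave this file untouched; other files still apply
        for old, new in pairs:
            text = text.replace(old, new)
        result[path] = text
    return result

files = {
    "api.ts": "export function fetchUser() {}",
    "app.ts": "import { fetchUser } from './api';",
}
# Rename the export and its import in one staged batch.
renamed = apply_batch(files, {
    "api.ts": [("fetchUser", "loadUser")],
    "app.ts": [("fetchUser", "loadUser")],
})
```

Staging everything before applying anything is what prevents the "some files renamed, some not" failure mode.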


Scenario 6: Inserting Tests

The task: Insert a new suite of 10 unit tests at a specific location in a long test file, between two existing test suites.

Built-in tools: The agent must find an anchor string near the insertion point, then replace the anchor with "anchor + new tests." This rewrites content around the insertion point and risks deleting or corrupting the end of the preceding suite or the beginning of the next one. The developer reviews not just the new tests but every surrounding line.

Mouse: An insert operation places the new test suite after a specific line number. Surrounding code is never read, echoed, or touched. The developer reviews only the new tests.


Scenario 7: Mid-Line Tailwind Insertions

The task: Add responsive breakpoint classes (md:w-1/2 lg:w-1/3) to className strings across 12 components, inserting at the correct position within each existing class list.

Built-in tools: The agent reads each component, finds the className string, echoes the entire class list (20+ utility classes), and replaces it with the same list plus new classes. Across 12 files, 12 long class strings are echoed and rewritten. Any class reordering or accidental deletion creates a subtle styling bug.

Mouse: Character-level precision inserts only the new classes at the exact position within each className string, without echoing or rewriting surrounding classes. With batch editing, all 12 insertions execute atomically across files.
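A character-offset insertion inside a line can be sketched in one function. This is a hypothetical helper, not Mouse's actual API: the new text is spliced in at an exact column, so the existing classes are never echoed or reordered.

```python
def insert_at(line, column, text):
    """Splice `text` into `line` at 0-based `column`."""
    return line[:column] + text + line[column:]

jsx = 'className="flex w-full p-4"'
# Insert the breakpoint classes right after w-full (which ends at column 22).
updated = insert_at(jsx, 22, " md:w-1/2 lg:w-1/3")
```

The edit specifies a position and the new text only; the 20+ existing utility classes are not part of the operation.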


Scenario 8: Clean Deletion

The task: Delete 60 lines consisting of 3 deprecated functions from a source file, leaving no blank-line artifacts or formatting damage.

Built-in tools: The agent reads all 60 lines, echoes them back verbatim as the search parameter, and replaces them with nothing. String replacement frequently mishandles whitespace and line endings, leaving blank lines or formatting artifacts. If even one character in the 60-line echo doesn't match (a trailing space, a line ending), the replacement fails entirely.

Mouse: A delete operation removes the exact line range. The agent specifies only two numbers (start and end), regardless of whether the deletion is 6 lines or 600. Mouse auto-normalizes line endings and closes the resulting gap cleanly, leaving no blank-line artifacts. Token cost is constant and trivial.
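The line-range delete reduces to a two-number operation, as this sketch shows. The helper is hypothetical, not Mouse's actual API: the deleted content never appears in the edit itself, so the cost does not grow with the size of the deletion.

```python
def delete_lines(lines, start, end):
    """Remove 1-based lines start..end inclusive; the remainder joins
    cleanly with no gap left behind."""
    return lines[:start - 1] + lines[end:]

source = [
    "def keep():",
    "    pass",
    "def deprecated():",
    "    pass",
    "def also_keep():",
    "    pass",
]
# Delete the deprecated function: lines 3-4.
trimmed = delete_lines(source, 3, 4)
```

Whether the range is 2 lines or 60, the edit is still just (start, end).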