Merged
Changes from 1 commit
2 changes: 2 additions & 0 deletions src/seclab_taskflows/mcp_servers/local_file_viewer.py
@@ -106,6 +106,8 @@ async def fetch_file_content(
     if not source_path or not source_path.exists():
         return f"Invalid {owner} and {repo}. Check that the input is correct or try to fetch the repo from gh first."
     lines = get_file(source_path, path)
+    if len(lines) > 1000:
Copilot AI · Dec 11, 2025:
The magic number 1000 should be extracted as a named constant at the module level. This makes the limit configurable and self-documenting. Consider defining something like MAX_FILE_LINES = 1000 at the top of the file with the other constants.
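As a minimal sketch of that suggestion (the constant name comes from the comment itself; the helper function is hypothetical, for illustration only):

```python
# Module-level constant as the comment suggests; MAX_FILE_LINES = 1000
# matches the currently hard-coded limit.
MAX_FILE_LINES = 1000

def exceeds_display_limit(lines: list[str]) -> bool:
    """True when a file has more lines than we are willing to render."""
    return len(lines) > MAX_FILE_LINES

print(exceeds_display_limit(["line"] * 1001))  # → True
```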

+        return f"File {path} in {owner}/{repo} is too large to display ({len(lines)} lines). Please fetch specific lines using get_file_lines tool."
Copilot AI · Dec 11, 2025:

The size check happens after get_file has already loaded all lines into memory (line 108). For very large files, this could still cause memory issues or performance problems before the check is performed. Consider checking the file size or line count before fully loading the content, or use a streaming approach that stops reading once the limit is exceeded.

See below for a potential fix:

    # Stream file lines to avoid loading large files into memory
    # (assumes `io` and `zipfile` are imported at module level)
    lines = []
    found = False
    filename = strip_leading_dash(path)
    with zipfile.ZipFile(source_path) as z:
        for entry in z.infolist():
            if entry.is_dir():
                continue
            if remove_root_dir(entry.filename) == filename:
                found = True
                # Wrap the binary stream so iteration yields str lines, not bytes
                with io.TextIOWrapper(z.open(entry, 'r'), encoding='utf-8') as f:
                    for i, line in enumerate(f):
                        if i >= 1000:
                            return f"File {path} in {owner}/{repo} is too large to display (over {i} lines). Please fetch specific lines using get_file_lines tool."
                        lines.append(f"{i+1}: {line.rstrip()}")
                break
    if not found:
        return f"Unable to find file {path} in {owner}/{repo}"
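The comment's other option, a size pre-check before reading any content, can be done from the zip central directory alone. A sketch under the assumption that a byte cap is acceptable as a proxy for line count (`MAX_FILE_BYTES` and the demo archive are illustrative, not part of the project):

```python
import io
import zipfile

# Hypothetical cap; an assumption for illustration, not the project's limit.
MAX_FILE_BYTES = 1_000_000

def entry_too_large(z: zipfile.ZipFile, name: str) -> bool:
    """Consult the zip central directory: no file data is decompressed."""
    return z.getinfo(name).file_size > MAX_FILE_BYTES

# Build a small in-memory archive to demonstrate the check.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("small.txt", "hello\n")
    zf.writestr("big.txt", "x" * 2_000_000)

archive = zipfile.ZipFile(buf)
print(entry_too_large(archive, "big.txt"))  # → True
```

Because `ZipInfo.file_size` is read from archive metadata, this rejects oversized entries in constant time, at the cost of gating on bytes rather than lines.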

     if not lines:
         return f"Unable to find file {path} in {owner}/{repo}"
     for i in range(len(lines)):