
FIX: Thought Chain UI Improvements #45

Merged
wicky-zipstack merged 13 commits into main from fix/thought-chain-ui-improvements
Apr 7, 2026

Conversation

Contributor

@tahierhussain tahierhussain commented Apr 7, 2026

What

  • Updated initial thought chain message from "Preparing database information..." to "Gathering context..."
  • Consolidated EnhancedPromptInfo and PromptInfo into a single PromptInfo component
  • Made thought chain collapsible (collapsed by default, showing latest message with step count)
  • Added React performance optimizations (memoization, removed unused props)
  • Fixed duplicate loader issue when collapse is expanded during streaming

Why

  • The previous message "Preparing database information..." was too specific and didn't accurately describe the initial processing step
  • Having two separate components (EnhancedPromptInfo and PromptInfo) added unnecessary complexity
  • Collapsible thought chain improves UX by reducing visual clutter while still allowing users to see full history
  • React optimizations prevent unnecessary re-renders and improve performance

How

  • Backend: Changed the content string in llm_context.py
  • Frontend:
    • Renamed EnhancedPromptInfo.jsx to PromptInfo.jsx and removed legacy component
    • Added Ant Design Collapse component for collapsible UI
    • Removed unused props (isThoughtChainReceived, llmModel, coderLlmModel, chatIntent)
    • Added useMemo for processedChain and errorDetails objects
    • Added condition to hide header loader when expanded (isInProgress && !isExpanded)
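
The loader and header rules above can be sketched as plain helpers. This is an illustrative sketch only (function names like `shouldShowHeaderLoader` are hypothetical, not the actual component code); it also folds in the later review fix that hides the step count when there are no previous steps:

```javascript
// Illustrative sketch of the PromptInfo display rules (helper names are hypothetical).

// The header spinner is shown only while streaming AND while the panel is
// collapsed; when expanded, the loader renders in the body instead, which
// is what fixes the duplicate-loader issue.
function shouldShowHeaderLoader(isInProgress, isExpanded) {
  return isInProgress && !isExpanded;
}

// Collapsed header text: the latest message, plus a "(N previous steps)"
// count that is suppressed for single-message chains.
function collapsedHeader(thoughtChain) {
  const latest = thoughtChain[thoughtChain.length - 1];
  const previousCount = thoughtChain.length - 1;
  return previousCount > 0
    ? `${latest} (${previousCount} previous steps)`
    : latest;
}
```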

Can this PR break any existing features? If yes, please list possible items. If no, please explain why. (PS: Admins do not merge the PR without this section filled.)

  • No, this PR should not break existing features
  • The thought chain functionality remains the same, only the UI presentation has changed
  • All props that were removed were already unused
  • The component API is backward compatible

Database Migrations

  • None

Env Config

  • None

Relevant Docs

  • None

Related Issues or PRs

  • None

Dependencies Versions

  • No new dependencies added

Notes on Testing

  • Test thought chain display during prompt execution (streaming)
  • Verify collapsible behavior: collapsed by default, shows "(N previous steps)" count
  • Verify expanded view shows all messages in chronological order
  • Verify loader only appears in header when collapsed, and in body when expanded
  • Test error scenarios to ensure error popovers still work correctly

Screenshots

Collapsed

Screenshot from 2026-04-07 14-01-28

Expanded

image

Checklist

I have read and understood the Contribution Guidelines.

tahierhussain and others added 4 commits April 7, 2026 14:04
Change "Preparing database information..." to "Gathering context..."
for a more accurate description of the initial processing step.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
- Remove EnhancedPromptInfo.jsx (renamed to PromptInfo.jsx as the only implementation)
- Remove unused props: isThoughtChainReceived, llmModel, coderLlmModel, chatIntent
- Remove unused memoized values from Conversation.jsx
- Memoize processedChain with useMemo to prevent re-computation
- Memoize errorDetails object to prevent unnecessary re-renders

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
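Why memoizing `errorDetails` matters can be shown without React at all: two objects with identical contents are still different references, so building a fresh object literal on every render defeats shallow-comparison memoization in child components. A plain-JS sketch (the factory function is illustrative):

```javascript
// Sketch: equal contents, different references. Passing a fresh object
// like this on every render is the re-render trigger that memoizing
// errorDetails with useMemo avoids.
function makeErrorDetails(code, message) {
  return { code, message };
}

const first = makeErrorDetails(500, "Query failed");
const second = makeErrorDetails(500, "Query failed");
// first !== second, even though first.code === second.code and
// first.message === second.message.
```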
Add CSS for collapsible thought chain component:
- Collapsed by default, shows latest message with step count
- Expand to see all messages in chronological order
- Proper padding and styling for collapsed/expanded states

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
@tahierhussain tahierhussain self-assigned this Apr 7, 2026
@tahierhussain tahierhussain added the enhancement New feature or request label Apr 7, 2026
greptile-apps bot commented Apr 7, 2026

Greptile Summary

This PR consolidates EnhancedPromptInfo and PromptInfo into a single PromptInfo component, introduces a collapsible thought-chain UI using Ant Design's Collapse, and updates the initial backend message to "Gathering context...". All five issues flagged in previous review rounds have been resolved:

  • The "Processing..." shimmer now correctly returns null for completed conversations (shouldStream guard on line 566).
  • The "(0 previous steps)" label is hidden when previousCount === 0.
  • The errorDetails variable shadow was renamed to msgErrorDetails.
  • React keys now use the message's display text instead of [object Object].
  • The overly-broad disapproval-reason fallback regex was removed — only the explicit [DISAPPROVE REASON] prefix is matched.
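
The last fix can be sketched as a plain helper. The `[DISAPPROVE REASON]` prefix string is the one named in this PR; the function name is illustrative, not the actual code:

```javascript
// Sketch of the narrowed disapprove-reason parsing. The removed fallback
// regex /^[^:]+:\s*.+/ matched any string containing a colon, so it would
// also have consumed ordinary progress messages like "Creating: model.sql".
function extractDisapproveReason(message) {
  const PREFIX = "[DISAPPROVE REASON]";
  if (typeof message === "string" && message.startsWith(PREFIX)) {
    return message.slice(PREFIX.length).trim();
  }
  return null; // not a disapproval message
}
```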

Changes by file:

  • llm_context.py: Correct, minimal string change.
  • Conversation.jsx: llmModels, EnhancedPromptInfo, and several unused useMemo values removed; errorDetailsMemo added for a stable object reference passed to PromptInfo.
  • PromptInfo.jsx: Thought chain is now collapsible (collapsed by default), the header spinner is correctly suppressed when expanded, and the component returns null for historical conversations with no active stream.
  • ThoughtChainEnhancements.css: Clean new utility classes for the collapsible layout.
  • EnhancedPromptInfo.jsx: Deleted — no longer needed.
  • Minor nit: The err parameter in the copyErrorToClipboard catch block is declared but never used (see inline comment).

Confidence Score: 5/5

Safe to merge — all previous P1/P2 issues are resolved and no new blocking issues were found.

All five issues identified in prior review rounds have been addressed. The only remaining finding is a cosmetic P2 lint nit (unused catch variable). The component consolidation is clean, removed props were confirmed unused, and the collapsible UX change is backward-compatible.

No files require special attention; PromptInfo.jsx is the most heavily changed file but its logic is sound.

Important Files Changed

  • backend/backend/application/context/llm_context.py: Straightforward string change from 'Preparing database information...' to 'Gathering context...'
  • frontend/src/ide/chat-ai/Conversation.jsx: Removed EnhancedPromptInfo import, unused llmModels prop, and computed values; added errorDetailsMemo for a stable prop reference
  • frontend/src/ide/chat-ai/EnhancedPromptInfo.jsx: File deleted; functionality consolidated into PromptInfo.jsx
  • frontend/src/ide/chat-ai/ExistingChat.jsx: No longer passes llmModels to Conversation; all other Conversation props remain unchanged
  • frontend/src/ide/chat-ai/PromptInfo.jsx: Merged EnhancedPromptInfo into PromptInfo; added collapsible thought chain via Ant Design Collapse; all five previous review issues resolved; minor unused catch variable remains
  • frontend/src/ide/chat-ai/ThoughtChainEnhancements.css: Added CSS for collapsible thought chain: .thought-chain-collapse, .thought-chain-collapse-header, and related layout classes

Flowchart

```mermaid
%%{init: {'theme': 'neutral'}}%%
flowchart TD
    A["Conversation.jsx\nreceives message prop"] --> B["Derives errorDetailsMemo\n(useMemo)"]
    B --> C["PromptInfo\nthoughtChain / shouldStream / errorDetails"]
    C --> D{thoughtChain\nnot empty?}
    D -- Yes --> E["processedChain useMemo\npairs attempt + DISAPPROVE REASON"]
    E --> F["renderThoughtChain()"]
    F --> G["Ant Design Collapse\ncollapsed by default"]
    G -- Collapsed --> H["Header: latest message\n+ N previous steps count"]
    G -- Expanded --> I["Body: full timeline\nrenderMessage() for each step"]
    D -- No --> J{shouldStream?}
    J -- Yes --> K["Fallback shimmer\n'Processing...'"]
    J -- No --> L["return null\n(completed conversation)"]
```
Prompt To Fix All With AI
This is a comment left during a code review.
Path: frontend/src/ide/chat-ai/PromptInfo.jsx
Line: 140-148

Comment:
**Unused `err` parameter in catch block**

The caught error is declared but never used — neither logged nor surfaced in the notification description. This will trigger a `no-unused-vars` ESLint warning and silently discards the failure reason, making clipboard issues harder to debug.

```suggestion
      .catch(() => {
        notification.error({
          message: "Copy failed",
          description: "Could not copy to clipboard",
          placement: "topRight",
          duration: 2,
        });
      });
```

How can I resolve this? If you propose a fix, please make it concise.


tahierhussain and others added 4 commits April 7, 2026 14:18
Prevent showing indefinite "Processing..." shimmer for completed
conversations that have no thought chain data.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
Only show "(N previous steps)" label when previousCount > 0 to avoid
displaying misleading "(0 previous steps)" for single-message chains.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
Rename local variable in parseMessage from errorDetails to msgErrorDetails
to prevent shadowing the component-level errorDetails prop.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
Extract display text from object messages for the key instead of
getting "[object Object]" from string coercion.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
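A minimal sketch of that key derivation. The message shape and the `text` field name are assumptions for illustration, not taken from the actual code:

```javascript
// String-coercing an object yields "[object Object]", so every object
// message would collide on the same React key. Derive the key from the
// display text instead. (The `text` field name is an assumption.)
function getMessageKey(message, index) {
  const text =
    typeof message === "string" ? message : (message && message.text) || "";
  return `${text}-${index}`;
}
```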
Contributor

@abhizipstack abhizipstack left a comment


LGTM

Contributor

@wicky-zipstack wicky-zipstack left a comment


LGTM — clean consolidation, all prior review issues resolved. Two minor P2s for follow-up: unused llmModels required prop in Conversation.jsx, and latest message duplication when Collapse is expanded.

tahierhussain and others added 3 commits April 7, 2026 15:22
Prevent duplicate display of latest message by hiding the header
content when expanded, since all messages are shown in the body.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
Override Ant Design's default 8px border-radius on .ant-collapse-content
to match the parent container's 12px border-radius.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
The llmModels prop was no longer used after removing the
llmModelDisplayName and coderLlmModelDisplayName memoized values.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
Contributor Author

tahierhussain commented Apr 7, 2026

> LGTM — clean consolidation, all prior review issues resolved. Two minor P2s for follow-up: unused llmModels required prop in Conversation.jsx, and latest message duplication when Collapse is expanded.

@wicky-zipstack Thanks for pointing it out. Both P2 items have been addressed as well.

tahierhussain and others added 2 commits April 7, 2026 16:10
The fallback regex /^[^:]+:\s*.+/ matched any string with a colon,
which could incorrectly consume legitimate messages like
"Creating: model.sql". Keep only the explicit [DISAPPROVE REASON]
prefix check.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
The Conversation component no longer uses llmModels, so stop
passing it from ExistingChat.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
@wicky-zipstack wicky-zipstack merged commit 1c1daa2 into main Apr 7, 2026
8 checks passed
4 participants