UX Feedback: Mobile UI Layout Optimization
Problem: The current mobile user interface allocates significant screen real estate to input fields, model selectors, and action buttons. This leaves limited vertical space for viewing conversation history and AI-generated responses, especially when the on-screen keyboard is active. This can necessitate frequent scrolling to review content.

Proposed Solution: Optimize the mobile UI layout to maximize the visible conversation area. Potential approaches include:
- Dynamic Hiding/Minimization: Automatically hide or minimize non-essential UI elements (e.g., model selection dropdowns, action buttons, bottom navigation bar) when they are not immediately needed, such as when the user is reading through conversation history or when the keyboard is active and focus is on the message input field.
- Collapsible Sections: Allow users to manually collapse or expand certain UI sections (e.g., "Files" or model selection areas) to free up space when desired.
- Contextual Display: Only display certain UI elements when they are relevant to the current user action (e.g., show action buttons clearly when the input field is active, but minimize them otherwise).

Benefit: Improved user experience through enhanced readability of conversation content, reduced need for vertical scrolling, and a less cluttered interface on mobile screens.
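A minimal sketch of the keyboard-driven hiding idea, assuming the chat view exposes a root element and a hypothetical "ss-compact" CSS class that collapses the model selector and action row; the visualViewport resize check is one way to detect the on-screen keyboard on mobile WebViews:

```ts
// Sketch: toggle a hypothetical "ss-compact" class while the on-screen keyboard is open.
// `chatContainerEl` and the class name are illustrative, not SystemSculpt's actual names.
function watchKeyboard(chatContainerEl: HTMLElement): void {
  const viewport = window.visualViewport;
  if (!viewport) return; // older WebViews may not expose visualViewport

  const fullHeight = window.innerHeight;
  viewport.addEventListener("resize", () => {
    // Heuristic: a large drop in visual viewport height usually means the keyboard is up.
    const keyboardOpen = viewport.height < fullHeight * 0.75;
    chatContainerEl.classList.toggle("ss-compact", keyboardOpen);
  });
}
```

CSS attached to "ss-compact" could then hide the model selector and shrink the action row, leaving the input field and message list visible while typing.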
somnian 6 days ago
Feature Request
Add .bases support
I want to be able to tell SystemSculpt to create a .base file (dataview/datacore would be nice too).
Devin Lewis 7 days ago
Feature Request
UX Feedback: Streamlining Message Resubmission and Editing with Contextual Warnings
Issue: Counter-intuitive message resubmission and editing flow.

Summary of Current Problems
- "Resubmit" button: Requires an unexpected extra click ("Resubmit?") which only populates the input field, forcing a manual submit.
- "Edit" button: Updates the history but does not re-evaluate the message, leading to confusion about its purpose.

Proposed Solution: Unified "Edit/Resubmit" Workflow with Contextual Warnings

Concept: Combine "Resubmit" and "Edit" into a single, clearer flow. If resubmitting an older message, a warning appears and requires confirmation before proceeding, as this will truncate subsequent chat history.

New Workflow Steps
1. Initial Action: User clicks a single, unified button (e.g., "Edit/Resubmit Message" or a combined icon) on a past user message.
2. Modal Appearance: A popup window (modal) appears, pre-populated with the selected message's content. The user can edit the message within this modal.
3. User Choice within Modal: The modal presents two distinct action buttons:
   a. "Update History Only" (or "Save Edit")
      - Action: Updates the message in the chat history without sending it to the LLM.
      - Purpose: For historical corrections only.
   b. "Send to LLM" (or "Resubmit Edited")
      - Action: Sends the current content of the modal's text field to the LLM as a new message.
      - Purpose: To get a new LLM response based on the (potentially edited) user input.
4. Conditional Warning: If the selected message is not the most recent user message, a prominent warning appears: "Sending this message will remove all subsequent messages from this point forward in the chat history. Do you wish to proceed?" The "Send to LLM" button is disabled until the user explicitly confirms this warning (e.g., via a checkbox or a separate confirmation button).

Benefits of Proposed Solution
- Clarity: Explicit user intent for editing vs. resubmitting.
- Efficiency: Streamlined resubmission process.
- Data Integrity: Prevents accidental chat history truncation by requiring explicit confirmation for non-linear resubmissions.
- Transparency: Clearly communicates the consequences of actions.

Environment
- Obsidian: v1.8.10
- System Sculpt: v1.2.1
- OS: Windows, iOS
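A minimal sketch of the conditional-warning decision described above, using hypothetical ChatMessage/ChatSession shapes rather than SystemSculpt's real data model:

```ts
// Illustrative types; SystemSculpt's actual message model may differ.
interface ChatMessage { id: string; role: "user" | "assistant"; content: string; }
interface ChatSession { messages: ChatMessage[]; }

// "Update History Only": edit in place, no LLM call, no truncation.
function updateHistoryOnly(session: ChatSession, id: string, newContent: string): void {
  const msg = session.messages.find((m) => m.id === id);
  if (msg) msg.content = newContent;
}

// "Send to LLM": if the target is not the latest user message, the caller must have
// shown the truncation warning and received explicit confirmation first.
function resubmit(session: ChatSession, id: string, newContent: string, confirmedTruncation: boolean): ChatMessage[] {
  const index = session.messages.findIndex((m) => m.id === id);
  const lastUserIndex = session.messages.map((m) => m.role).lastIndexOf("user");
  if (index < lastUserIndex && !confirmedTruncation) {
    throw new Error("Truncation not confirmed: subsequent messages would be removed.");
  }
  session.messages = session.messages.slice(0, index);              // drop the old turn and everything after it
  session.messages.push({ id, role: "user", content: newContent }); // re-append the edited message for a fresh LLM call
  return session.messages;
}
```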
somnian 9 days ago
Feature Request
Copy Button Corrupts Markdown Formatting
Bug: When using the "Copy" button on LLM responses:
- Copied content loses all Markdown formatting (bold, lists, code blocks, etc.)
- Excessive/unexpected line breaks are added to the output
- Manual copy/paste preserves formatting correctly (the issue is specific to the plugin's copy function)

Reproduction Steps
1. Generate an LLM response containing Markdown elements (e.g., - list item, **bold text**, blockquote, code fences)
2. Click the response's "Copy" button
3. Paste the content into any Markdown editor (Obsidian, VS Code, etc.)
4. Observe: all formatting stripped (plaintext only); added line breaks (e.g., a single list becomes multiple paragraphs)

Expected Behavior
- Copied content should preserve the original Markdown syntax identically
- Line breaks and structure should match the source output exactly

Environment
- Obsidian: v1.8.10
- System Sculpt: v1.2.1
- OS: Confirmed on Windows and iOS

Technical Notes
- Manual text selection + copy/paste works correctly → confirms OS clipboard functions properly
- Issue likely in the plugin's copy-handling logic (e.g., extracting textContent instead of raw Markdown)
- Particularly problematic on iOS, where text selection is challenging
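A sketch of the suggested fix direction, assuming the plugin keeps the raw Markdown of each response (a hypothetical rawMarkdown field here) rather than reading the rendered element's textContent:

```ts
// Copy the stored Markdown source instead of the rendered DOM text.
// `rawMarkdown` is a hypothetical field; the point is to avoid element.textContent,
// which strips formatting and introduces extra line breaks.
async function copyResponse(message: { rawMarkdown: string }): Promise<void> {
  try {
    await navigator.clipboard.writeText(message.rawMarkdown);
  } catch {
    // Fallback for WebViews where the async Clipboard API is unavailable (e.g., some iOS versions).
    const ta = document.createElement("textarea");
    ta.value = message.rawMarkdown;
    document.body.appendChild(ta);
    ta.select();
    document.execCommand("copy");
    ta.remove();
  }
}
```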
somnian 9 days ago
🐛 Bug Reports
Missing LLM Responses with Unavailable Models
BUG: When reopening chat histories where the original LLM model is unavailable:
- Only user messages appear in the chat interface.
- LLM responses vanish without any placeholder, error message, or indication they ever existed.

Reproduction Steps
1. Execute command: Open SystemSculpt Chat History.
2. Select a chat entry created with a model no longer available.
3. Observe: a popup notification essentially stating "Model isn't available"; the chat window displays only user messages; zero traces of LLM responses.

Expected Behavior
- Fallback display of LLM responses (e.g., as plaintext).
- Possibly a visual indicator that responses are missing (e.g., a warning icon listing the original model used).

Environment
- Obsidian: v1.8.10
- System Sculpt: v1.2.1
- OS: Windows, iOS

Critical Questions for Developer
- Are LLM responses still stored in the note's source Markdown?
- Can responses degrade to plaintext when models are unavailable?
- Can a persistent warning (e.g., a banner) explain why responses are missing?
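A minimal sketch of the fallback behavior requested above; the function names and shapes are illustrative, not SystemSculpt internals:

```ts
interface StoredMessage { role: "user" | "assistant"; content: string; }

// Sketch: never drop assistant messages just because the original model is gone.
// Returns the rows to display plus an optional banner explaining the degraded view.
function buildFallbackView(messages: StoredMessage[], savedModelId: string, availableModels: Set<string>) {
  const modelMissing = !availableModels.has(savedModelId);
  const banner = modelMissing
    ? `Model "${savedModelId}" is no longer available; responses below are shown as plain text.`
    : null;
  // Keep every stored message; mark assistant turns read-only instead of hiding them.
  const rows = messages.map((m) => ({ ...m, readOnly: modelMissing && m.role === "assistant" }));
  return { banner, rows };
}
```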
somnian 9 days ago
🐛 Bug Reports
Perplexity Endpoint Error
The Perplexity endpoint is not recognized as a custom model and will not connect. Probably because the endpoint is only https://api.perplexity.ai/ while the system automatically points to https://api.perplexity.ai/v1/models
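One plausible fix direction, sketched under the assumption that the plugin currently appends an OpenAI-style "/v1/models" path to whatever base URL the user enters: normalize the URL and fall back to a manually entered model name when the provider has no model-listing route, instead of refusing to connect.

```ts
// Sketch: tolerate custom endpoints that lack an OpenAI-style "/v1/models" route.
// `manualModel` is a hypothetical settings field for a hand-typed model name.
async function discoverModels(baseUrl: string, apiKey: string, manualModel?: string): Promise<string[]> {
  const root = baseUrl.replace(/\/+$/, ""); // "https://api.perplexity.ai/" -> "https://api.perplexity.ai"
  try {
    const res = await fetch(`${root}/v1/models`, { headers: { Authorization: `Bearer ${apiKey}` } });
    if (!res.ok) throw new Error(`HTTP ${res.status}`);
    const body = await res.json();
    return body.data.map((m: { id: string }) => m.id);
  } catch {
    // Provider has no model-listing route (or a different one): don't block the connection.
    return manualModel ? [manualModel] : [];
  }
}
```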
TomSpoct 14 days ago
🐛 Bug Reports
Chat with web transcripts (and other file types)
I've been doing a lot of Zoom meetings that have transcription turned on, and I have a library of .vtt files attached to Obsidian notes. I realized that I can't chat with them natively because I can't pull them into context with this dialog. It would be nice to have .vtt explicitly added and, in general, an option for "any" (or maybe deselecting all the buttons could give access to all the files in the vault). Thanks!
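A small sketch of what an extensible allowlist with an "any file" escape hatch might look like; the extension list and setting names here are assumptions, not SystemSculpt's actual configuration:

```ts
// Sketch: an extensible allowlist for the context-file picker, with an "any file" option.
const DEFAULT_CONTEXT_EXTENSIONS = ["md", "txt", "pdf", "vtt"]; // ".vtt" added for Zoom transcripts

function canAddToContext(filePath: string, allowAny: boolean, allowed: string[] = DEFAULT_CONTEXT_EXTENSIONS): boolean {
  if (allowAny) return true; // "any" option: expose every file in the vault
  const ext = filePath.split(".").pop()?.toLowerCase() ?? "";
  return allowed.includes(ext);
}
```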
Matt Stone 14 days ago
Feature Request
Easier Chat Navigation by Leveraging Obsidian Sidebar Outline View
Scrolling back and forth through long conversations to find previous messages or AI responses can really take a while. My suggestion is to make navigation easier by leveraging the space and concept of the Obsidian Outline sidebar. Imagine if that right sidebar (the one you can toggle open) could display a simple list representing the conversation turns, maybe just labeling them something like "Your Message 1", "AI Response 1", "Your Message 2", "AI Response 2", and so on. The key feature would be that clicking on one of those items in the sidebar would instantly jump the main chat window right to that specific message or response. This feels like it could make reviewing past parts of a conversation much, much faster and smoother. And since Obsidian already has an outline sidebar, perhaps there's a way to hook into that functionality (or maybe it would need to be a custom implementation just for SystemSculpt). Either way, being able to quickly jump to any point in the chat from the sidebar would be a really helpful improvement for managing longer conversations. Thanks for considering it!
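A rough sketch of the jump-to-turn idea, assuming the rendered messages carry hypothetical "ss-user-message" / "ss-ai-message" classes (SystemSculpt's real class names are unknown here):

```ts
// Sketch: build a clickable turn list next to the chat and scroll to the chosen message.
function buildChatOutline(chatEl: HTMLElement, outlineEl: HTMLElement): void {
  outlineEl.innerHTML = "";
  const turns = Array.from(chatEl.querySelectorAll<HTMLElement>(".ss-user-message, .ss-ai-message"));
  let userCount = 0;
  let aiCount = 0;
  for (const turn of turns) {
    const isUser = turn.classList.contains("ss-user-message");
    const label = isUser ? `Your Message ${++userCount}` : `AI Response ${++aiCount}`;
    const item = outlineEl.appendChild(document.createElement("div"));
    item.textContent = label;
    item.onclick = () => turn.scrollIntoView({ behavior: "smooth", block: "start" });
  }
}
```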
somnian 19 days ago
Feature Request
AI renaming of PDFs
I have a lot of academic PDFs and they all have random-ish file names. I would like SystemSculpt to give me the option to have AI rename the .pdf file in my vault based on the article name or other extracted text. Thanks!
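A sketch of how such a rename could be wired through Obsidian's API; suggestTitle stands in for whatever LLM call extracts the article title and is purely hypothetical:

```ts
import { App, TFile, normalizePath } from "obsidian";

// Sketch: rename a PDF to an AI-suggested title.
async function renamePdfFromTitle(app: App, file: TFile, suggestTitle: (f: TFile) => Promise<string>): Promise<void> {
  if (file.extension !== "pdf") return;
  const title = (await suggestTitle(file))
    .replace(/[\\/:*?"<>|]/g, "") // strip characters that are illegal in file names
    .trim()
    .slice(0, 120);
  if (!title) return;
  const folder = file.parent ? file.parent.path : "";
  // fileManager.renameFile also updates links pointing at the old path.
  await app.fileManager.renameFile(file, normalizePath(`${folder}/${title}.pdf`));
}
```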
Oblique82 22 days ago
Feature Request
iOS Issue: No Settings Exit from Chat Window Settings
After opening settings from a chat window, there's no visible way to exit/close the settings panel except by force-quitting Obsidian.

Expected Behavior: "Back" button, "X" icon, or swipe gesture dismisses settings.
Observed Behavior: User is trapped in settings with no intuitive exit path.

Reproduction Steps:
1. Open settings from a chat window.
2. Attempt to return to chat: no visible exit option.

Suggested Fixes:
- Add a persistent "Close" button.
- Enable the iOS swipe-to-dismiss gesture.
somnian 23 days ago
🐛 Bug Reports
iOS Issue: Model Selection Buttons Non-Functional
When tapping "Change Model" buttons in SystemSculpt, the button visually depresses (appears clicked) but no model selection menu appears.

Expected Behavior: Tap button → menu pops up with available models.
Observed Behavior: Button feedback (visual press) occurs, but no menu is displayed.

Reproduction Steps:
1. Open a chat in SystemSculpt.
2. Tap any "Change Model" button.
3. Observe: no menu appears.
somnian 23 days ago
🐛 Bug Reports
Completed
Checkout Problems
Apologies if this forum is only for System Sculpt itself, but I would consider a different commerce provider. I am trying to give you money right now, and I keep getting "[object Object]" errors.
rcbcarm987 27 days ago
Feature Request
Toggle Option: Invert Enter / Ctrl+Enter Behavior
Request: Add an option to switch the behavior of Enter and Ctrl+Enter in the message input field. Currently, Enter submits the message and Ctrl+Enter inserts a newline. Requesting a toggle to invert this behavior: Enter for newline, Ctrl+Enter to submit.

This is supported in interfaces like Slack, Discord, Notion, Microsoft Teams, and Google Chat.

The feature is useful for users who draft longer responses and prefer not to submit prematurely.
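A minimal sketch of the toggle, assuming a hypothetical "invertEnterBehavior" setting; Cmd+Enter is treated like Ctrl+Enter for macOS/iOS users:

```ts
// Default (invert = false): Enter submits, Ctrl+Enter inserts a newline. Inverted: the opposite.
function handleInputKeydown(e: KeyboardEvent, ta: HTMLTextAreaElement, invert: boolean, submit: () => void): void {
  if (e.key !== "Enter") return;
  const withModifier = e.ctrlKey || e.metaKey; // treat Cmd+Enter like Ctrl+Enter
  const shouldSubmit = invert ? withModifier : !withModifier;
  e.preventDefault();
  if (shouldSubmit) {
    submit();
  } else {
    // Insert a newline at the caret ourselves, since some browsers ignore modified Enter presses.
    const { selectionStart, selectionEnd, value } = ta;
    ta.value = value.slice(0, selectionStart) + "\n" + value.slice(selectionEnd);
    ta.selectionStart = ta.selectionEnd = selectionStart + 1;
  }
}
```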
somnian 27 days ago
Feature Request
Transcripts
Can I give some quick feedback? When you create an .md file directly from a conversation and there is a source document like an audio file, it would be great if the callout that contains the full transcript was collapsed by default. Otherwise it's a long dump of data to scroll through. I know I can edit it manually to make it collapse, though. Just a thought!
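For reference, Obsidian callouts start folded when the callout type is followed by a minus sign. A small sketch of emitting the transcript that way, assuming the plugin builds the note body as a string:

```ts
// Sketch: wrap a transcript in an Obsidian callout that is collapsed by default.
// The trailing "-" after the callout type is what makes it start folded.
function transcriptCallout(title: string, transcript: string): string {
  const body = transcript
    .split("\n")
    .map((line) => `> ${line}`) // every callout line must be quoted
    .join("\n");
  return `> [!quote]- ${title}\n${body}\n`;
}
```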
Oblique82 27 days ago
Feature Request
iOS issues
I'm a new user and I've just installed this plugin into a fresh vault. I'm getting a lot of errors on iOS that don't seem to happen on the normal Mac version when I'm trying to change AI models, and the AI simply stops working after the first prompt.
Oblique82 27 days ago
🐛 Bug Reports
Using Ollama for Embedding
Having had no success using LMStudio, either for embedding or chat, I thought I would try Ollama. For embedding, I looked at the info at https://ollama.com/library/nomic-embed-text and pulled that model. As per the web page, I set the link to "http://localhost:11434/api/embeddings", but there is nowhere to enter the model name. It just returns errors. My connection to Ollama for chat works fine.
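For context, Ollama's /api/embeddings endpoint expects the model name in the request body, which is why the settings UI would need a model-name field alongside the endpoint URL. A minimal sketch of a working request:

```ts
// Sketch: call Ollama's embeddings endpoint directly.
// Usage (illustrative): ollamaEmbed("http://localhost:11434", "nomic-embed-text", "some text")
async function ollamaEmbed(baseUrl: string, model: string, text: string): Promise<number[]> {
  const res = await fetch(`${baseUrl.replace(/\/+$/, "")}/api/embeddings`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model, prompt: text }), // model name goes in the body, e.g. "nomic-embed-text"
  });
  if (!res.ok) throw new Error(`Ollama embeddings request failed: HTTP ${res.status}`);
  const data = await res.json();
  return data.embedding as number[];
}
```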
David Torrens About 1 month ago
🐛 Bug Reports
Change directory for Saved Chats
Add the Saved Chats folder to the list of configurable directories so users can change it if they wish to do so.
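A sketch of what the settings entry could look like, assuming a hypothetical savedChatsFolder field on the plugin's settings object (SystemSculpt's real settings shape may differ):

```ts
import { Setting } from "obsidian";

// Sketch: a text setting for a configurable Saved Chats folder.
function addSavedChatsFolderSetting(
  containerEl: HTMLElement,
  plugin: { settings: { savedChatsFolder: string }; saveSettings: () => Promise<void> }
): void {
  new Setting(containerEl)
    .setName("Saved Chats folder")
    .setDesc("Vault folder where chat histories are stored.")
    .addText((text) =>
      text
        .setPlaceholder("SystemSculpt/Saved Chats")
        .setValue(plugin.settings.savedChatsFolder)
        .onChange(async (value) => {
          plugin.settings.savedChatsFolder = value;
          await plugin.saveSettings();
        })
    );
}
```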
John Nguyen About 1 month ago
Feature Request
FileServer MCP
Implement a FileServer MCP so models can create .md files. This comes from being spoiled by Cline/Claude Code.
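Independent of how the MCP transport itself would be wired up, a sketch of the file-creation handler such a tool could expose, using Obsidian's vault API and a basic path guard; the function name and sanitization are assumptions:

```ts
import { App, normalizePath } from "obsidian";

// Sketch: create a Markdown file inside the vault on behalf of a model.
async function createMarkdownFile(app: App, relativePath: string, content: string): Promise<string> {
  const path = normalizePath(relativePath.replace(/\.\./g, "")); // block path traversal out of the vault
  const withExt = path.endsWith(".md") ? path : `${path}.md`;
  if (app.vault.getAbstractFileByPath(withExt)) {
    throw new Error(`File already exists: ${withExt}`);
  }
  await app.vault.create(withExt, content);
  return withExt;
}
```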
John Nguyen About 1 month ago
Feature Request