# LM Studio Setup Guide

Quick-start instructions for running local AI models with Bwat using LM Studio.
## 🖥️ Configuring LM Studio with Bwat

Run AI models locally on your machine and integrate them with Bwat.
## ✅ Requirements

- Windows, macOS, or Linux system with AVX2 support
- Bwat extension installed in VS Code
## 🛠️ Installation & Configuration
- **Get LM Studio**
  - Download from lmstudio.ai
  - Complete the installation for your OS
- **Launch the Application**
  - Open LM Studio after installation
  - The interface contains four key sections:
    - **Chat**: interactive chat interface
    - **Developer**: server controls
    - **My Models**: local model storage
    - **Discover**: model marketplace
- **Acquire a Model**
  - Browse the "Discover" section
  - Select and download your preferred AI model
  - Allow time for the download to complete (varies by model size)

- **Activate the Local Server**
  - Navigate to the "Developer" tab
  - Toggle the server to "Running"
  - The server runs at the default address http://localhost:1234; you can verify it is reachable with the sketch below
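Once running, the server exposes an OpenAI-compatible API. Here is a minimal sketch that confirms the server is reachable and lists the models it serves, assuming the default address (adjust the URL if you changed the port):

```typescript
// List the models LM Studio is serving via its OpenAI-compatible API.
// Assumes the default server address http://localhost:1234.
async function listLocalModels(): Promise<void> {
  const res = await fetch("http://localhost:1234/v1/models");
  if (!res.ok) {
    throw new Error(`Server responded with status ${res.status}`);
  }
  const body = await res.json();
  // The endpoint returns { data: [{ id: "..." }, ...] }
  for (const model of body.data) {
    console.log(model.id);
  }
}

listLocalModels().catch((err) => console.error("LM Studio unreachable:", err));
```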

- **Connect Bwat**
  - Launch VS Code
  - Open the Bwat settings
  - Choose "LM Studio" as the API provider
  - Select your downloaded model (see the sketch after this list for what the connection looks like under the hood)
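For context, clients that connect to LM Studio typically send requests to its OpenAI-compatible chat completions endpoint. The sketch below illustrates such a request; it is not Bwat's actual code, and "your-model-id" is a placeholder for an id returned by /v1/models:

```typescript
// Send a single chat request to LM Studio's OpenAI-compatible endpoint.
// "your-model-id" is a placeholder: use a model id reported by /v1/models.
async function chatOnce(prompt: string): Promise<string> {
  const res = await fetch("http://localhost:1234/v1/chat/completions", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "your-model-id", // placeholder model id
      messages: [{ role: "user", content: prompt }],
      temperature: 0.7,
    }),
  });
  if (!res.ok) {
    throw new Error(`Request failed with status ${res.status}`);
  }
  const data = await res.json();
  return data.choices[0].message.content;
}

chatOnce("Say hello in one sentence.").then(console.log).catch(console.error);
```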

## 📌 Important Reminders

- LM Studio must remain running during Bwat sessions
- Initial model downloads may require significant time
- All models are stored locally after download
- If issues occur, verify the server status in the Developer tab
## 🚦 Troubleshooting Tips

**Connection failures** (a quick programmatic check follows this list):
- Confirm the LM Studio server is active
- Verify the model is properly loaded
- Check that your system meets the minimum requirements
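If connections keep failing, a short health check can tell you quickly whether the server is answering at all. This is a hypothetical helper, assuming the default address; the timeout makes a stopped server fail fast instead of hanging:

```typescript
// Returns true if the LM Studio server answers within 3 seconds.
async function checkServer(baseUrl = "http://localhost:1234"): Promise<boolean> {
  const controller = new AbortController();
  const timer = setTimeout(() => controller.abort(), 3000); // 3 s timeout
  try {
    const res = await fetch(`${baseUrl}/v1/models`, { signal: controller.signal });
    return res.ok;
  } catch {
    return false; // connection refused or timed out: server is not running
  } finally {
    clearTimeout(timer);
  }
}

checkServer().then((ok) =>
  console.log(ok ? "LM Studio server is up." : "Server not reachable; check the Developer tab."),
);
```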
**Performance issues:**
- Try smaller models first
- Close other resource-intensive applications
- Ensure proper ventilation for thermal management