2 prompts
Implement streaming LLM responses with Server-Sent Events. Token-by-token output from API to browser with proper error handling and abort.
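A minimal Python sketch of the Server-Sent Events wire format this prompt targets: each token arrives as a `data:` line and events are separated by blank lines. The function name and the `[DONE]` sentinel are illustrative assumptions, not part of the prompt itself.

```python
def parse_sse(stream):
    """Yield the data payload of each SSE event from an iterable of lines."""
    buffer = []
    for line in stream:
        line = line.rstrip("\n")
        if line.startswith("data:"):
            # Accumulate data lines; multi-line events are joined with "\n".
            buffer.append(line[5:].lstrip())
        elif line == "":
            # Blank line terminates an event: emit what we buffered.
            if buffer:
                yield "\n".join(buffer)
                buffer = []

# Example: three events, the last a hypothetical end-of-stream sentinel.
tokens = list(parse_sse(iter([
    "data: Hello\n", "\n",
    "data: world\n", "\n",
    "data: [DONE]\n", "\n",
])))
```

In a real implementation the line iterable would come from a streaming HTTP response, and the consumer would stop on the sentinel and honor an abort signal by closing the connection.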
An intermediate-level code prompt for creating a WebSocket chat server. Fill in the variables to customize the output for your specific needs.
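A minimal Python sketch of the broadcast logic at the core of a chat server like the one this prompt generates. The class and method names are assumptions for illustration; a real server would wrap this in a WebSocket handler (e.g. the `websockets` library) and deliver messages over open connections rather than in-memory inboxes.

```python
class ChatRoom:
    """In-memory chat room: tracks connected clients, broadcasts messages."""

    def __init__(self):
        self.clients = {}  # client_id -> list of messages delivered to it

    def join(self, client_id):
        self.clients[client_id] = []

    def leave(self, client_id):
        self.clients.pop(client_id, None)

    def broadcast(self, sender_id, text):
        # Deliver to every connected client except the sender.
        for cid, inbox in self.clients.items():
            if cid != sender_id:
                inbox.append(f"{sender_id}: {text}")

room = ChatRoom()
room.join("alice")
room.join("bob")
room.broadcast("alice", "hi")
```

The same join/leave/broadcast shape maps directly onto WebSocket connect, disconnect, and message events.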