duyn's avatar
Level 7

How to implement OpenAI streaming in a Laravel Inertia app?

I know the fantastic openai-php package supports OpenAI's streaming APIs (e.g., https://github.com/openai-php/client?tab=readme-ov-file#create-streamed-2) but I have not had any luck figuring out how to process the streaming response on the client side in Vue. Anyone have any ideas?

0 likes
2 replies
LaryAI's avatar
Level 58

To implement OpenAI streaming in a Laravel Inertia app, you'll need to handle the streaming response from the server and update the client-side Vue component in real-time. Here's a step-by-step guide to achieve this:

Server-Side (Laravel)

  1. Install the OpenAI PHP Client:

    First, ensure you have the OpenAI PHP client installed in your Laravel project:

    composer require openai-php/client
    
  2. Set Up the Streaming Endpoint:

    Create a controller method to handle the OpenAI streaming request. This method will stream data to the client.

    use OpenAI;
    
    public function streamOpenAIResponse()
    {
        // The client is created via the factory method, not `new Client(...)`.
        $client = OpenAI::client('your-openai-api-key');
    
        return response()->stream(function () use ($client) {
            // createStreamed() returns an iterable stream of response chunks;
            // it does not take a callback.
            $stream = $client->completions()->createStreamed([
                'model' => 'gpt-3.5-turbo-instruct', // text-davinci-003 has been retired
                'prompt' => 'Your prompt here',
                'max_tokens' => 100,
            ]);
    
            foreach ($stream as $chunk) {
                echo "data: " . json_encode($chunk->toArray()) . "\n\n";
                if (ob_get_level() > 0) {
                    ob_flush();
                }
                flush();
            }
        }, 200, [
            'Content-Type' => 'text/event-stream',
            'Cache-Control' => 'no-cache',
            'Connection' => 'keep-alive',
            'X-Accel-Buffering' => 'no', // stop nginx from buffering the stream
        ]);
    }
    

    This method uses Server-Sent Events (SSE) to stream data to the client. For chunks to reach the browser as they arrive, PHP output buffering and any reverse-proxy buffering must be disabled; sending an X-Accel-Buffering: no header covers the common nginx case.

  3. Define the Route:

    Add a route to your web.php file to access this controller method.

    Route::get('/stream-openai', [YourController::class, 'streamOpenAIResponse']);
    

Client-Side (Vue with Inertia)

  1. Set Up the Vue Component:

    In your Vue component, use the EventSource API to listen to the server-sent events.

    <template>
      <div>
        <h1>OpenAI Streaming Response</h1>
        <pre>{{ response }}</pre>
      </div>
    </template>
    
    <script>
    export default {
      data() {
        return {
          response: '',
        };
      },
      mounted() {
        const eventSource = new EventSource('/stream-openai');
    
        eventSource.onmessage = (event) => {
          const data = JSON.parse(event.data);
          this.response += data.choices?.[0]?.text ?? '';
        };
    
        // EventSource reconnects automatically, so this also fires when the
        // server closes the finished stream; close here to avoid replaying
        // the whole request.
        eventSource.onerror = () => {
          console.error('Stream ended or errored');
          eventSource.close();
        };
      },
    };
    </script>
    

    This component listens for messages from the server and appends the streamed text to the response data property.

  2. Handle the Stream:

    The onmessage event handler processes each chunk of data received from the server and updates the Vue component's state.
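If your endpoint also emits a terminal data: [DONE] line once the completion finishes (a convention borrowed from OpenAI's own SSE wire format, and an assumption about your controller rather than something guaranteed above), the chunk handling can be factored into a small function that accumulates text and signals completion. The makeChunkHandler name is illustrative:

```javascript
// Sketch of a chunk handler: accumulates streamed text and detects a
// hypothetical "[DONE]" sentinel. The payload shape (choices[0].text)
// matches the completions response used above.
function makeChunkHandler() {
  let text = '';
  return {
    handle(raw) {
      if (raw.trim() === '[DONE]') return { text, done: true };
      const data = JSON.parse(raw);
      text += data.choices?.[0]?.text ?? '';
      return { text, done: false };
    },
  };
}
```

In onmessage you would call handler.handle(event.data) and close the EventSource once done is true, instead of waiting for onerror to fire.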

Summary

This setup allows you to stream data from OpenAI's API to your Laravel server and then to your Vue client using Server-Sent Events. The key is to ensure that the server streams data correctly and the client listens and updates the UI in real-time.
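One caveat: EventSource only issues GET requests, so if you need to send a POST body (e.g. the user's prompt), you'll have to read the response with fetch() and parse the SSE frames yourself. The parser below is a sketch that works on any web ReadableStream, such as the response.body you get from fetch; it assumes you also register a POST route for the same controller method:

```javascript
// Sketch: parse "data: ..." payloads out of an SSE byte stream.
// Pass response.body from fetch(); onData receives each raw payload string.
async function readSseStream(stream, onData) {
  const reader = stream.getReader();
  const decoder = new TextDecoder();
  let buffer = '';
  for (;;) {
    const { value, done } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });
    const frames = buffer.split('\n\n');
    buffer = frames.pop(); // keep any trailing partial frame for the next read
    for (const frame of frames) {
      for (const line of frame.split('\n')) {
        if (line.startsWith('data: ')) onData(line.slice(6));
      }
    }
  }
}
```

You would then call fetch('/stream-openai', { method: 'POST', ... }) and pass response.body to this function, appending each parsed payload to the component state exactly as in the onmessage handler above.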

otezz's avatar

Hi @duyn. Did you figure out how to handle this? I'm facing a similar problem with prism when trying to stream a response to an Inertia component.
