Adding the following lines to the robots.txt file tells Google not to crawl anything under the /api/ path:
User-agent: *
Disallow: /api/
However, Disallow only blocks crawling, not indexing: a URL that is linked from elsewhere can still appear in Google's index without its content. To remove a page from the index, add a robots meta tag with the noindex directive to the page's HTML head. Note that Google must be able to crawl the page to see this tag, so the path must not remain blocked in robots.txt while the deindexing takes effect.
<meta name="robots" content="noindex">
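A meta tag only works for HTML responses. If the API endpoints return JSON, the equivalent is the X-Robots-Tag HTTP response header, which can be set from a Laravel middleware. A minimal sketch (the NoIndex class name is illustrative, not part of Laravel):

```php
<?php

namespace App\Http\Middleware;

use Closure;
use Illuminate\Http\Request;

// Hypothetical middleware: attaches a noindex directive to every
// response it wraps, regardless of content type.
class NoIndex
{
    public function handle(Request $request, Closure $next)
    {
        $response = $next($request);

        // X-Robots-Tag is honored by Google for non-HTML responses
        // (JSON, PDFs, etc.) where a meta tag is not possible.
        $response->headers->set('X-Robots-Tag', 'noindex');

        return $response;
    }
}
```

The middleware would then be registered on the API route group so every /api/ response carries the header.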
Another solution is to add authentication to the API endpoints, so that only authorized users can access them. This can be done using Laravel's built-in authentication system or a third-party package like Laravel Passport.
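As a sketch, protecting an API route group with Laravel's auth middleware might look like this (the route and controller names are illustrative; the auth:api guard assumes a token-based setup such as Laravel Passport):

```php
<?php

// routes/api.php — illustrative example, not an exact drop-in.
use Illuminate\Support\Facades\Route;
use App\Http\Controllers\UserController; // hypothetical controller

// Unauthenticated requests receive a 401 response, so crawlers
// never see indexable content from these endpoints.
Route::middleware('auth:api')->group(function () {
    Route::get('/users', [UserController::class, 'index']);
});
```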
It's also a good idea to check Google Search Console regularly for indexing issues and to submit a sitemap that lists only the pages you want indexed.