
pdmfc

Laradock, 2 projects, one docker image

Hi,

Let me explain my environment: my Laradock setup contains two projects.

Project 1: Frontend (frontend.local)
Project 2: API (api.local)

Both work fine separately, but when I try to access the API via its domain, I get this error: cURL error 7: Failed to connect to api.local port 80 after 4004 ms: Couldn't connect to server (see https://curl.haxx.se/libcurl/c/libcurl-errors.html) for http://api.local/api/login

What do I need to do to connect both projects in the same container?

Thanks

LaryAI

To connect both projects in the same container using Laradock, you need to configure the Nginx server to handle requests for both frontend.local and api.local domains.

Here are the steps to follow:

  1. Open the Laradock folder in your project.

  2. Locate the nginx/sites directory and open the default.conf file.

  3. Inside the default.conf file, you will find the server block for the frontend project. Copy this server block and paste it below the existing one.

  4. Modify the copied server block to handle requests for the api.local domain. Update the server_name directive to api.local and the root directive to the path of your API project.

  5. Save the default.conf file.

  6. Open your hosts file (located at /etc/hosts on Linux or C:\Windows\System32\drivers\etc\hosts on Windows) and add the following entries:

127.0.0.1 frontend.local
127.0.0.1 api.local
  7. Restart the Laradock containers by running the following commands from the Laradock directory:
docker-compose down
docker-compose up -d nginx
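For reference, the second server block from step 4 might look something like the sketch below. The root path is an assumption — adjust it to wherever your API project actually lives inside the container. The `php-upstream` name matches Laradock's bundled Nginx templates:

```nginx
server {
    listen 80;
    listen [::]:80;

    # Domain this block answers for
    server_name api.local;

    # Assumed path to the API project's public directory inside the container
    root /var/www/api/public;
    index index.php index.html;

    location / {
        try_files $uri $uri/ /index.php?$query_string;
    }

    # Hand PHP requests to the php-fpm upstream defined by Laradock
    location ~ \.php$ {
        fastcgi_pass php-upstream;
        fastcgi_index index.php;
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
        include fastcgi_params;
    }
}
```

Laradock's nginx/sites directory also ships .example templates (e.g. laravel.conf.example) that you can copy and rename instead of editing default.conf.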

Now you should be able to access both projects at their respective domains. Make sure your frontend code uses the correct domain (api.local) when it makes API requests.

If you still encounter any issues, ensure that your API project is running on port 80 inside the container and that there are no conflicts with other services running on your local machine.
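One thing worth noting: the hosts file edit in step 6 only affects your host machine, not the containers. Since the cURL error in the question comes from PHP itself, the php-fpm container also needs to resolve api.local — and it won't see your host's hosts file. A minimal sketch of one way to handle this, assuming Laradock's default service name (nginx) and network names (frontend, backend), is to give the nginx service network aliases in docker-compose.yml:

```yaml
# Sketch for Laradock's docker-compose.yml (service/network names assumed
# to match Laradock defaults). The aliases let other containers, such as
# php-fpm, resolve frontend.local and api.local to the nginx container.
services:
  nginx:
    networks:
      frontend:
        aliases:
          - frontend.local
          - api.local
      backend:
        aliases:
          - frontend.local
          - api.local
```

After adding the aliases, recreate the containers (docker-compose up -d nginx) so the network settings take effect.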
