
Laravel 10 - Logging to Cloudwatch won't work

I started upgrading my Laravel implementation from 8 to 10 and got everything to work except for the logging. I previously used the package provided by maxbanton and switched to phpnexus/cwh because the old package doesn't work with Laravel 10. Everything seemed to be working just fine, but my logs keep getting lost somewhere between the logger call and AWS. The Laravel log file shows no issues during logger creation, yet AWS does not track any usage of the access key this application uses. To me it seems like the logs get fired by Laravel but are never received by CloudWatch. This is my configuration:

In config/logging.php:

  'cloudwatch' => [
    'driver' => 'custom',
    'via' => \App\Logging\CloudWatchLoggerFactory::class,
    'stream' => 'app',
    'sdk' => [
      'region' => env('AWS_DEFAULT_REGION', 'eu-central-1'),
      'version' => 'latest',
      'credentials' => [
        'key' => env('AWS_CLOUDWATCH_KEY_ID'),
        'secret' => env('AWS_CLOUDWATCH_SECRET_ACCESS_KEY')
      ]
    ],
    'retention' => env('LOG_RETENTION', null),
    'level' => env('LOG_LEVEL','error'),
  ],
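For reference, the channel above expects the following variables in .env (the names are taken from the env() calls in the config; the values here are placeholders):

```
AWS_DEFAULT_REGION=eu-central-1
AWS_CLOUDWATCH_KEY_ID=your-access-key-id
AWS_CLOUDWATCH_SECRET_ACCESS_KEY=your-secret-access-key
LOG_RETENTION=14
LOG_LEVEL=error
```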

In App\Logging\CloudWatchLoggerFactory.php:

<?php

namespace App\Logging;

use Aws\CloudWatchLogs\CloudWatchLogsClient;
use Monolog\Logger;
use Monolog\Formatter\JsonFormatter;
use PhpNexus\Cwh\Handler\CloudWatch;

class CloudWatchLoggerFactory
{
    /**
     * Create a custom Monolog instance.
     *
     * @param  array  $config
     * @return \Monolog\Logger
     */
    public function __invoke(array $config)
    {

        $sdkParams = $config["sdk"];
        $tags = $config["tags"] ?? [ ];
        $name = $config["name"] ?? 'cloudwatch';

        // Instantiate AWS SDK CloudWatch Logs Client
        $client = new CloudWatchLogsClient($sdkParams);

        // Log group name; will be created if it doesn't exist
        $groupName = config('app.name') . '-' . config('app.env');

        // Log stream name; will be created if it doesn't exist
        $streamName = $config['stream'];

        // Days to keep logs; set to null for indefinite retention
        $retentionDays = $config["retention"];

        // Instantiate handler (tags are optional)
        $handler = new CloudWatch($client, $groupName, $streamName, $retentionDays, 10000, $tags);

        // Optionally set the JsonFormatter to be able to access your log messages in a structured way
        $handler->setFormatter(new JsonFormatter());

        // Create a log channel
        $logger = new Logger($name);

        // Set handler
        $logger->pushHandler($handler);

        return $logger;
    }
}

My Composer Imports:

    "php": "^8.0",
    "aws/aws-sdk-php": "3.295.0",
    "doctrine/dbal": "^3.0",
    "firebase/php-jwt": "^5.4",
    "guzzlehttp/guzzle": "^7.0.1",
    "laravel/framework": "^10.0",
    "laravel/sanctum": "^3.2",
    "laravel/tinker": "^2.5",
    "league/flysystem-aws-s3-v3": "^3.0",
    "phpnexus/cwh": "^3.1",
    "stripe/stripe-php": "^7.89"

To test this, I used Tinker:

php artisan tinker
\Log::channel('cloudwatch')->debug('test from Laravel App');

The above commands yield the following response:

	= null

And that's all I get. Nothing is written to CloudWatch Logs (the log group and stream are not even created) or to the Laravel log file.

I have triple-checked all environment variables for correctness. For debugging purposes, I removed all safeguards from the AWS IAM policy attached to the user account whose access keys my app is using, so any blocking on the AWS side seems to be out of the question too. Any help would be greatly appreciated.

morito (OP) · Best Answer
Update: Alright, it seems I wasn't able to see the forest for the trees. The answer is so simple that it's kind of embarrassing, but maybe it helps someone else. In my case, the batch size was set to 10,000. I didn't catch it sooner because I assumed the log group and stream would be created regardless of batch size, but I was wrong: the handler only talks to AWS once the batch is full. Setting the batch size to 1 fixed the issue for me.
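For anyone hitting the same thing: the batch size is the fifth constructor argument of the handler. A sketch of the corrected instantiation, assuming the same factory as in the question (phpnexus/cwh buffers log records and only sends them to CloudWatch once the batch fills up):

```php
// In CloudWatchLoggerFactory::__invoke(): the 5th argument is the batch size.
// With 10000, nothing reaches AWS (and no log group or stream is created)
// until 10,000 records have accumulated.
$handler = new CloudWatch(
    $client,
    $groupName,
    $streamName,
    $retentionDays,
    1,      // batch size of 1: every record is sent to CloudWatch immediately
    $tags
);
```

A batch size of 1 trades throughput for immediacy (one API call per log record), which is convenient for debugging and short-lived processes like a Tinker session; for high-volume production logging a larger batch may be preferable, provided the process lives long enough for the buffer to fill or be flushed.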

