JacobFHolland's avatar

SQLSTATE[HY000] [2006] MySQL server has gone away

Running into an error when I try to run a Job. It only happens occasionally while processing the job, usually when the data being processed is large.

SQLSTATE[HY000]: General error: 2006 MySQL server has gone away

Through some googling I found this article: https://laravel-news.com/laravel-and-mysql-8-fixing-mysql-server-has-gone-away-error

That article seems geared toward Linux users, though. Right now I am using XAMPP on Windows, so I opened the XAMPP control panel, opened the config file for MySQL, and added this to it:

[mysqld]
default_authentication_plugin=mysql_native_password

Restarted the Laravel server, XAMPP, and MySQL, and I am still getting this error.

I know the Job functions properly with a small amount of data (234 entries into the database through the job works fine; 9,000 entries does not), so I am assuming this is not a problem with the actual code inside my Job class. Previously, while making a different job that did mostly the same thing (yay troubleshooting!), I never received this error, and that Job was able to process 40,000 entries.

I am assuming I am getting this error because something is timing out while creating the payload for the job? Just not sure what's going on here.
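One way to test a "payload too large" theory is to measure how big the serialized job data actually is and compare it against MySQL's max_allowed_packet (XAMPP's bundled my.ini often ships with it set to just 1M; that default is an assumption here, so check your own config). A rough sketch in Python with made-up rows standing in for the real job data:

```python
import json

def payload_bytes(rows):
    # Approximate the serialized size of a job payload carrying these rows.
    return len(json.dumps(rows).encode("utf-8"))

# Hypothetical rows standing in for the real job data.
small = [{"id": i, "name": f"record {i}"} for i in range(234)]
large = [{"id": i, "name": f"record {i}"} for i in range(9_000)]

max_allowed_packet = 1 * 1024 * 1024  # assumed 1M default in XAMPP's my.ini

print(payload_bytes(small), payload_bytes(large))
print(payload_bytes(large) <= max_allowed_packet)
```

If the large payload lands above the limit while the small one stays under it, that would explain why only the big runs fail.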

0 likes
7 replies
bugsysha's avatar

That is exactly why I've switched to SQLite, or to Docker when I need to match production as closely as possible.

JacobFHolland's avatar

So would you recommend I just move to Redis or something? Is this just a limitation of MySQL? I'm just looking for a practical solution to this problem. I have never used Redis before, but I am sure it can't be that hard to swap over.

bugsysha's avatar

Not Redis, but SQLite or MySQL under Docker. PostgreSQL is also a great option, but also under Docker.

JacobFHolland's avatar

So to confirm: I just moved to SQLite to test it. The job processes just fine; however, it adds entries at roughly 1/1000th of the speed it was adding entries to MySQL. Is it supposed to be this slow? I have to enter 40,000 entries, and that would normally run at (somewhat close to) 100 records per second on MySQL. SQLite is doing roughly 10 records or fewer per second.

EDIT: Also, while it is processing the Job, TablePlus seems to repeatedly lose its connection to the database file mid-entry. This does not seem like a better way to go about it. I feel like I should be fixing the MySQL problem rather than creating new, different problems.

bugsysha's avatar

For me SQLite works very well. It is not something I would use in production, but for a local environment I've had no issues. Do you have a fast SSD? For me it is not slow at all.

I also use TablePlus and it has never dropped the connection with SQLite. Not sure what your workflow is, but mine always worked out of the box.
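On the speed point: a common cause of slow SQLite inserts is committing every row individually, since each commit forces a sync to disk (especially painful on an HDD). Batching the whole run into one transaction usually helps a lot. A minimal sketch with Python's sqlite3 module (the table name and columns are made up for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # use your database file path in practice
conn.execute("CREATE TABLE entries (id INTEGER PRIMARY KEY, name TEXT)")

# Hypothetical rows standing in for the job's data.
rows = [(i, f"record {i}") for i in range(9_000)]

# One transaction for the whole batch: a single commit instead of 9,000.
with conn:
    conn.executemany("INSERT INTO entries (id, name) VALUES (?, ?)", rows)

count = conn.execute("SELECT COUNT(*) FROM entries").fetchone()[0]
print(count)  # 9000
```

The same idea applies inside Laravel: wrap the bulk inserts in a database transaction so SQLite syncs once per batch instead of once per row.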

JacobFHolland's avatar
OP · Best Answer · Level 3

Yeah, I have a standard HDD, which is probably the difference we are experiencing. For those of you who are googling and reading from the future (hello, future people): I resolved this issue by modifying my MySQL config file.

I changed max_allowed_packet under [mysqld] to 10M, though depending on the size of the data you are passing, you may need more. 10MB in a single MySQL field is a lot of data, so if you are passing more than that into a single field, maybe it's time to re-evaluate what your process is.
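For reference, the change in my.ini looks like this (10M is what worked for me; the right value depends on your own payload sizes, so treat it as a starting point):

```ini
[mysqld]
max_allowed_packet=10M
```

Restart MySQL after editing the file so the new value takes effect.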

I also added default_authentication_plugin=mysql_native_password to the same section, per the advice of the article linked above, though I don't think that actually resolved or changed anything. It was the first thing I tried, and the error didn't go away until I made max_allowed_packet larger. I am leaving it in since it doesn't seem to hurt anything, and I am not curious to find out how everything behaves if I remove it. I'm 99.99% sure the packet size was what fixed it.

EDIT: Just to further clarify, so you aren't going "ok, well, that fixes it, but why did it fix it": I believe what was happening was that my Job was pretty dang large. Not only was it a lengthy, complex operation itself, but the dataset it ultimately needed to process was very large. MySQL could not physically fit all the information required into the payload field of the jobs table, because the insert exceeded the limit on how much data can be sent in a single packet. Bumping that limit up allowed my Job to be stored properly in the jobs table.
