I've inherited a project that's a mishmash of legacy code, multiple developers' styles, and half-implemented pieces of Laravel (for example, they've brought in Eloquent but only use the query builder). I'm tasked with implementing unit testing. It's actually been surprisingly straightforward so far, but now I'm coming up against the fact that there's no database seeding and migrations are inconsistently implemented; the local database is a raw SQL dump of the production server. For obvious reasons, I'd prefer to spin up a fresh database for testing.
The way I most frequently do this on my own projects is just to use database seeding with an in-memory SQLite database, but a) converting MySQL to SQLite isn't fun, and b) a 9GB SQLite database would be a bit much.
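For reference, this is the setup I mean — the usual phpunit.xml override, assuming the project's config/database.php reads these env vars the way stock Laravel does:

```xml
<!-- phpunit.xml: point the test run at an in-memory SQLite database -->
<php>
    <env name="DB_CONNECTION" value="sqlite"/>
    <env name="DB_DATABASE" value=":memory:"/>
</php>
```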
My best bet for database testing would, I believe, be to duplicate the local MySQL database and point to the duplicate when testing. However, it's still not a perfect solution. Due to the database size, a normal SQL dump takes ~1.5 hours to run, which makes resetting the database after running a full test suite a difficult proposition.
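Concretely, the duplication I have in mind is just this (database names hypothetical) — and the dump/restore step is the ~1.5-hour part:

```
# Clone the local DB into a throwaway test schema.
# --single-transaction takes a consistent InnoDB snapshot without locking;
# --quick streams rows instead of buffering them in memory.
mysqladmin -u root -p create legacy_test
mysqldump -u root -p --single-transaction --quick legacy_local \
  | mysql -u root -p legacy_test
```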
So I suppose what I'm looking for is either:

a) a foolproof way to roll back a series of database changes, or
b) a much, much (probably impossibly) faster way to duplicate a ~9GB MySQL database.
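For (a), the closest thing I'm aware of is Laravel's DatabaseTransactions trait, which wraps each test in a transaction that gets rolled back afterwards — a sketch of what I mean, with a hypothetical test class and table (I'm not sure it counts as foolproof with this legacy code, since MySQL DDL statements implicitly commit and would escape the rollback):

```php
<?php

use Illuminate\Foundation\Testing\DatabaseTransactions;

class LegacyOrderTest extends TestCase
{
    // Opens a transaction before each test and rolls it back after,
    // so writes never persist in the test database.
    use DatabaseTransactions;

    public function testInsertIsRolledBack()
    {
        DB::table('orders')->insert(['status' => 'pending']);

        $this->assertEquals(
            1,
            DB::table('orders')->where('status', 'pending')->count()
        );
        // After this test the transaction is rolled back and the row is gone.
    }
}
```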
I have a feeling neither of these exists, which would be unfortunate. The long-term plan is to implement database seeding, but that's a long way off.
Any database aficionados know of some black magic that would help me out here?