
More than 13 million rows

Posted 2 months ago by Larasou

Hello people!

So, I have a table that contains more than 13 million rows, and that number is likely to keep growing over time.

Whenever I run a search, or even a simple query like:

App\Property::count();

It takes about 10 seconds to get the result.
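As far as I can tell, that call compiles to a single COUNT(*) statement. Here is a minimal sketch of how I checked what MySQL actually receives (the properties table name is just what I assume the model maps to):

use App\Property;
use Illuminate\Support\Facades\DB;

// Log the queries Eloquent sends to MySQL for the count() call.
DB::enableQueryLog();
Property::count();
dump(DB::getQueryLog());
// => "select count(*) as aggregate from `properties`"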

Is there a way to optimize this? Can something be done at the MySQL level?
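For example, would something like the sketch below be a reasonable approach? Option 1 caches the exact count and refreshes it on an interval; option 2 reads InnoDB's approximate row count from information_schema. The cache key, the 10-minute TTL, and the properties table name are placeholders I made up.

use App\Property;
use Illuminate\Support\Facades\Cache;
use Illuminate\Support\Facades\DB;

// Option 1: cache the exact count so the slow COUNT(*) only runs
// once per interval instead of on every request.
$count = Cache::remember('properties.count', now()->addMinutes(10), function () {
    return Property::count();
});

// Option 2: read InnoDB's estimated row count from information_schema.
// It returns almost instantly, but it is an approximation, not an exact count.
$approx = DB::selectOne(
    'SELECT TABLE_ROWS AS approx
       FROM information_schema.TABLES
      WHERE TABLE_SCHEMA = DATABASE() AND TABLE_NAME = ?',
    ['properties']
)->approx;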

Here is the configuration of my PC:

➜  ✗ screenfetch

OS: Manjaro 18.0.4 Illyria
Kernel: x86_64 Linux 4.19.49-1-MANJARO
Uptime: 4h 56m
Packages: 1261
Shell: zsh 5.7.1
Resolution: 3840x2160
DE: Xfce4
WM: Xfwm4
WM Theme: Matcha-dark-sea
GTK Theme: deepin [GTK2]
Icon Theme: Vibrancy-NonMono-Dark-Blue-Vivid
Font: Noto Sans 23
CPU: AMD Ryzen 7 1700X Eight-Core @ 16x 3.4GHz
GPU: GeForce GTX 1070
RAM: 5991MiB / 32160MiB

