Re: Store and updating large number of columns

From: The Natural Philosopher <tnp_at_invalid.invalid>
Date: Mon, 19 Mar 2018 18:47:16 +0000
Message-ID: <p8p0jl$m19$1_at_dont-email.me>


On 19/03/18 16:02, forough m wrote:
> Hi, I have many cells (10,000 per hour), and each cell has 1000
> counters (keys and values that arrive during the hour and need to be updated).

Do you mean by this that you have 10,000 RECORDS every hour, and there are 1000 key/value pairs for each record?

> I need to store the cell name and counters, then calculate some formulas
> on the counters.

So you are looking at 10,000 rows per hour, each of which will be 1000 fields wide? That is 10 million values an hour, or roughly 2,800 a second. Wow! A lot of data.

> But I don't know how to store the counters in MySQL to get the
> best update and select query performance.

The two are in opposition.

The more indices you have, the better (SELECT ... WHERE field = ...) performance you will get, but the slower the insertions will become.
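As a minimal sketch of that trade-off, assuming the counters are stored one key/value pair per row (the table, column, and counter names below are my own invention; the OP's actual schema isn't shown):

    -- One row per (cell, counter) pair; an assumed layout, not the OP's.
    CREATE TABLE cell_counters (
        cell_name   VARCHAR(64) NOT NULL,
        counter_key VARCHAR(64) NOT NULL,
        counter_val BIGINT      NOT NULL,
        PRIMARY KEY (cell_name, counter_key)
    ) ENGINE=InnoDB;

    -- A secondary index speeds up SELECT ... WHERE counter_key = ...
    CREATE INDEX idx_counter_key ON cell_counters (counter_key);

    -- ...but every INSERT now has to maintain that index as well.
    INSERT INTO cell_counters (cell_name, counter_key, counter_val)
    VALUES ('cell_0001', 'dropped_calls', 42)
    ON DUPLICATE KEY UPDATE counter_val = counter_val + VALUES(counter_val);

Every secondary index you add is one more B-tree that each insert has to update.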

> Now I keep them as JSONB in
> PostgreSQL, and want to store them in MySQL. MySQL's JSON type has no
> indexing feature, so it makes queries slow. What do you suggest?
>

Throw anything to do with JSON in the dustbin if you want efficiency.

I have gotten the greatest speed and efficiency with C-coded access to MySQL, and don't be clever with the SQL: use the C language to do the clever stuff.

You can test the efficiency of queries using the mysql command-line client, to see what impact indices have, and to prototype which queries will perform best.
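For instance, EXPLAIN at the mysql prompt shows whether a query can use an index (this assumes the cell_counters table sketched above):

    mysql> EXPLAIN SELECT cell_name, counter_val
        -> FROM cell_counters
        -> WHERE counter_key = 'dropped_calls';

If the 'key' column in the output names idx_counter_key, the index is being used; if it shows NULL, MySQL is scanning the whole table. Comparing the client's reported query time before and after adding an index gives you a rough benchmark.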

Having found that PHP simply falls apart on large data objects, I now use PHP as a shim to display the website screens, which then passes control to a program that actually does the queries on the large data sets I have.

I use JavaScript as little as possible. It really sucks.

-- 
"I guess a rattlesnake ain't risponsible fer bein' a rattlesnake, but ah
puts mah heel on um jess the same if'n I catches him around mah chillun".