Re: Loop on mysql output with bash

From: Lew Pitcher <lew.pitcher_at_digitalfreehold.ca>
Date: Mon, 12 Mar 2018 14:01:57 -0400
Message-ID: <p86fan$2i8$1_at_dont-email.me>


Boubaker Idir wrote:

> Hello Pitcher, thank you for your answer.
> I have a database named "Mydatabase" with a table wifi; the table
> wifi has these columns: id, packetid, bssid, rssi. Multiple ids have the
> same packetid, but different bssid and rssi. I want to do a SELECT on my
> database and put the result in a json file like this:
>
> {"wlan": [{"mac": "78:54:2E:EC:75:96", "powrx": -52},{"mac":
> {"16:2D:27:96:08:36", "powrx": -56},{"mac": "F8:D1:11:4C:28:EC", "powrx":
> {-73},{"mac": "00:14:6C:41:37:8E", "powrx": -76},{"mac":
> {"00:23:04:5C:73:9A", "powrx": -77},{"mac": "00:23:04:5C:73:90", "powrx":
> {-79},{"mac": "00:23:04:5C:60:40", "powrx": -81}]}
>
> I want this output in a json file for each packetid,

When there are rows that all have the same packetid, what do you want your json file to contain? Would it contain

a) a line for each row that has that packetid, or
b) a single line aggregating all the rows that have that packetid, or
c) a single line summarizing the first row that has that packetid, or
d) a single line summarizing the last row that has that packetid, or
e) something else (please provide details)?

> the same kind of result each
> time, to send it via curl to an API, so I must have a loop for each
> packetid, and each time the result goes into the json file.

Do you want a unique file for each unique packetid? Or would one file for all suffice?

> Thank you, I hope
> I explained it better.

If you will accept
a) a single line of JSON aggregating all the rows for each packetid, and
b) a single file holding all of those lines,
then
mysql -BN -u root -pMyPassword -s -N >text2.txt <<QUERY_INPUT
use myDatabase;
SELECT CONCAT("{""wlan"": [",
              GROUP_CONCAT(JSON_OBJECT("mac",bssid,"powrx",rssi)),
              "]}")
FROM wifi
GROUP BY packetid
ORDER BY packetid;
QUERY_INPUT
might be enough.
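
If, instead, you want a separate file per packetid, each one POSTed to
your API as it is produced, a sketch along these lines might do. The
endpoint URL and the packet_*.json file names are placeholders, and it
assumes MySQL 5.7 or later for JSON_OBJECT():

mysql -BN -u root -pMyPassword >packets.txt <<QUERY_INPUT
use myDatabase;
SELECT packetid,
       CONCAT("{""wlan"": [",
              GROUP_CONCAT(JSON_OBJECT("mac",bssid,"powrx",rssi)),
              "]}")
FROM wifi
GROUP BY packetid
ORDER BY packetid;
QUERY_INPUT

# Each line of packets.txt is: packetid <TAB> json
while IFS=$'\t' read -r packetid json
do
    printf '%s\n' "$json" >"packet_${packetid}.json"
    curl -X POST -H "Content-Type: application/json" \
         --data @"packet_${packetid}.json" \
         "https://api.example.com/endpoint"    # placeholder URL
done <packets.txt

One caveat either way: GROUP_CONCAT() truncates its result at
group_concat_max_len (1024 bytes by default), so if one packetid covers
many rows you may want to raise it first, e.g.
SET SESSION group_concat_max_len = 65535;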

-- 
Lew Pitcher
"In Skills, We Trust"
PGP public key available upon request