Easiest way to export insight data to local NAS storage

Since the insight data gets reduced in resolution over time to save storage space, I want to send the data from my very accurate 3-phase current measurement (L1, L2, L3) to my Synology NAS every day. I have tried the Export Insights app without success: either I'm not configuring it correctly, or it doesn't support v2.0.

Any Ideas to share?

Write to syslog?


If I dump the complete syslog, as I understand your proposal, how do I filter out specific data items from the insight data?

I guess there is no silver bullet. The question is how much time you are willing to put into it to get the results you want.

I solved this using a Docker image running Node-RED, InfluxDB and Grafana. From Homey I trigger on device changes and send the data to Node-RED. For this I have a bunch of flows in place.

Maybe a solution with MQTT Hub would have been easier but at the time there was no MQTT Hub yet.

Anyway, returning to Node-RED: it sorts out the data and stores it in InfluxDB.
While this seems like overkill and a lot of work, the payoff is immense. Total freedom to visualize your data as you see fit. Below you can see my dashboard. You can even visualize binary data (on/off, true/false), and if you want you can store your data forever :smile: Awesome!!!


You can even visualize CPU load and memory usage of your Homey device and apps.

This simple Node-RED flow is the workhorse feeding the data to InfluxDB.

Below is the “parse event data” function block, containing the following JavaScript:

var newmsg = {};
newmsg.payload = {};
var local_time = new Date().toLocaleTimeString();
newmsg.measurement = msg.payload.event.trim();

switch (newmsg.measurement) {
  // numeric capabilities: store as floats
  case "temperature":
  case "luminance":
  case "humidity":
  case "battery":
  case "current":
  case "power":
  case "power_meter":
    newmsg.payload.value = parseFloat(msg.payload.value.trim());
    break;
  // boolean capabilities: store as 0/1 so they can be graphed
  case "contact":
  case "motion":
  case "generic":
  case "onoff":
    newmsg.payload.value = (msg.payload.value.trim() === 'true' ? 1 : 0);
    break;
  // anything else is stored as a plain string
  default:
    newmsg.payload.value = msg.payload.value.trim();
}

// the InfluxDB out node accepts [fields, tags]; the second element holds the tags
newmsg.payload = [newmsg.payload, { name: msg.payload.name.trim() }];
newmsg.payload[1].zone = msg.payload.hasOwnProperty('zone') ? msg.payload.zone.trim() : 'unknown';
newmsg.payload[1].class = msg.payload.class.trim();

// show the last processed event under the node in the editor
node.status({ text: msg.payload.name.trim() + ' ' + newmsg.measurement + ' ' + msg.payload.value.trim() + ' (' + local_time + ')' });

return newmsg;
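To make the transformation concrete, here is a standalone sketch of what that function produces for one sample "power" event. The sample values (device name, zone, reading) are made up for illustration; in the flow, `msg` arrives from Homey and `node.status` is only available inside Node-RED, so both are inlined or left out here.

```javascript
// Hypothetical input: a "power" event as it might arrive from Homey
const msg = {
  payload: {
    event: ' power ',
    value: ' 230.5 ',
    name: ' Washing machine ',
    zone: ' Laundry ',
    class: ' socket '
  }
};

// Same transformation as the function node, for the numeric case
const newmsg = { payload: {} };
newmsg.measurement = msg.payload.event.trim();                // "power"
newmsg.payload.value = parseFloat(msg.payload.value.trim());  // 230.5
// [fields, tags]: the second element carries the InfluxDB tags
newmsg.payload = [newmsg.payload, { name: msg.payload.name.trim() }];
newmsg.payload[1].zone = msg.payload.zone.trim();
newmsg.payload[1].class = msg.payload.class.trim();

console.log(JSON.stringify(newmsg));
```

The result is a message the InfluxDB out node can store directly, with the value as a field and name/zone/class as tags.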

I’d be happy to share my node-red flows if you are interested.


Hi Magnus, I have the ‘Export Insights’ app on my todo list, to update it to the new Homey SDK and make it compatible with Homey V2. This will take some weeks though, since I’m doing it in my spare time :sunglasses:


@Marlon - Really impressed, thank you for sharing. However, this is a bit over my head and too time consuming to set up. I'm looking for a simpler solution that just dumps data on a shared disk. I think I'll wait for the updated Export Insights release.


@Gruijter - Looking forward to the update of the Export Insights app. I hope my use case of saving specific, selectable insight data to a local disk will be possible.

Hi @Magnus_P and others,

I'm trying to decode how the new Insights in Homey V2 work. The data setup has changed significantly from Homey V1, so I'm definitely not able to make the export the same as the present Export Insights app. I'm re-thinking how to do the export from a flow, but I would need to know how you would like to use the data.

I have the following in mind which works similar to what the present Export Insights does, but with some crucial differences!

  • You can make an action flowcard for All apps, or select a particular app to export
  • In the flowcard you can select the period (e.g.: last24Hrs, last31Days, last2Years).
  • Depending on the selected period you get a different resolution of the data. A short period gives the highest resolution.
  • The export will create a zipped folder per app (e.g. appName.zip)
  • In the zipfile, each app related device will have a subfolder with the name of the device
  • In the device folder each insights log will have its own .csv file with the name of the log
  • In the same folder as the .csv, a .meta file will be placed with information about the .csv data (e.g. from-to date, resolution and the internal Homey device ID)
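To picture the scheme described above, an export might look something like this (the app, device and log names are made up for illustration):

```
appName.zip
├── Washing Machine/
│   ├── measure_power.csv
│   ├── measure_power.meta
│   ├── measure_current.csv
│   └── measure_current.meta
└── Fridge/
    ├── measure_temperature.csv
    └── measure_temperature.meta
```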

Now my questions:

  • If you make e.g. a daily export, the zip file will be overwritten with the new data, thereby losing the old data. Is this what you want? Or should I make a separate zip file for each export (e.g. appName_exportDate_exportTime.zip)? In the second case you need to remove the old files manually.
  • Do you want zipfiles, or rather have flat folder/file export (without zipping)?
  • Should I use human readable device names, or the unique Homey device ID? When using human readable names, the name of the device can change over time, and thus the filename will also change.
  • Does it make sense to group the export per app, or would you rather have the export per individual device/log? In the second case you get a very (very) long list to select from in the action card, but it is doable.
  • Many other things to decide which I haven't even thought about :slight_smile:

Anyhow, it would be great if you all gave feedback on your use case for the export. :kissing_heart:

Hi,
Great that you are looking into the possibilities, even though there is a big change in the data structure in v2. I'll list some of the Christmas gifts I would like to have: :yum:

  • For a specific device I would like to select a specific capability (if there are many) to export. An example is my energy device, which has Power, Current and Voltage. Normally not everyone (at least not me) is interested in exporting all data from all devices of an app.

  • I would like the possibility to decide whether to append the data to a defined file or create a new file.

  • It should be possible for the user to name the file, with the option to add a unique datetime.

That's what I came up with for now; I'll feed back more ideas when they come :-)

Did anyone try this? https://www.influxdata.com/integration/mqtt-monitoring/

I would like the following configuration:

  • date in file name
  • zip files
  • human readable device names
  • group per app

Thx for the feedback! I will try to fit in some of your wishes.

  • The export will contain all capabilities/logs of a device. I will not make this selectable.
  • I will start by grouping per app. I might add export per individual device later.
  • Appending to an existing file is too difficult to do right now. First of all, appending doesn't work for network drives as far as I can tell from the library I'm using. Secondly, how should I handle appended data that overlaps with existing data, or has a gap, or even a different resolution? So I will stick to making a new zip file on each export, and include a date_time in the filename.
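A date_time-stamped filename like that could be built along these lines. This is just a sketch: the app id and the date are placeholders, and the actual app may format the stamp differently.

```javascript
// Sketch: build a zip filename of the form appName_exportDate_exportTime.zip
const appName = 'com.example.energy';          // placeholder app id
const now = new Date('2019-01-15T08:30:00Z');  // fixed date for a reproducible example
const pad = (n) => String(n).padStart(2, '0');
const stamp = now.getUTCFullYear() + '-' +
  pad(now.getUTCMonth() + 1) + '-' +
  pad(now.getUTCDate()) + '_' +
  pad(now.getUTCHours()) + pad(now.getUTCMinutes());
const filename = appName + '_' + stamp + '.zip';
console.log(filename); // → com.example.energy_2019-01-15_0830.zip
```

Because each export gets a unique name, nothing is overwritten, and old exports can be cleaned up by filename pattern.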

Thx! I think this stays closest to the existing app's export.

It is ready on GitHub! Who wants to give it a try?
https://github.com/gruijter/com.gruijter.insights2csv/tree/beta

Very interested, would you mind sharing the node-red flows and the homey setup (flows)?

thanks!

I use the Prometheus app for Homey (it exposes the current state of all devices). Then I run Prometheus in Docker on a computer to pull the state from Homey. Prometheus has an option to write these values straight to a newer InfluxDB, which supports the Prometheus protocol as well.

Here is my prometheus.yml config file to achieve this. 192.168.10.126 is the IP of the Homey, 192.168.10.10 is the IP of the machine running InfluxDB. You'll have to create the Influx database called “prometheus” first (create database …).

global:

scrape_configs:
  - job_name: 'homey'
    scrape_interval: 15s

    # metrics_path defaults to '/metrics'
    # scheme defaults to 'http'.

    static_configs:
      - targets: ['192.168.10.126:9414']

remote_write:
  - url: "http://192.168.10.10:8086/api/v1/prom/write?db=prometheus"

# Remote read configuration (for InfluxDB only at the moment).
remote_read:
  - url: "http://192.168.10.10:8086/api/v1/prom/read?db=prometheus"
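For reference, the “prometheus” database mentioned above can be created with the InfluxDB 1.x CLI. This is a sketch assuming the Influx host from the config and the default port; adjust to your setup.

```shell
# Create the target database on the machine running InfluxDB (1.x)
influx -host 192.168.10.10 -execute 'CREATE DATABASE prometheus'
```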