Can you give an example of what the folder name would look like? And why would you want to be able to set it per flow card?
For example, just the date (yyyymmdd) would be really nice. Maybe it’s even possible to make some kind of setting where it’s more freely configurable, like: %y%m%d (for yyyymmdd), %backupperiod (for last …), or even some free text for maximum freedom.
But in my case just the date would be great.
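The pattern idea suggested above could be done with simple token substitution. A minimal sketch, assuming strftime-style tokens (the function name and token set are hypothetical, not part of the app):

```javascript
// Hypothetical sketch: expand %y/%m/%d tokens in a folder-name pattern.
// The tokens are assumptions based on the suggestion above.
function expandPattern(pattern, date = new Date()) {
  const pad = (n) => String(n).padStart(2, '0');
  return pattern
    .replace(/%y/g, String(date.getFullYear()))
    .replace(/%m/g, pad(date.getMonth() + 1)) // getMonth() is zero-based
    .replace(/%d/g, pad(date.getDate()));
}

console.log(expandPattern('%y%m%d', new Date(2019, 9, 12))); // '20191012'
```

Free text would come along for free, since any characters that are not tokens pass through unchanged.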
I added it as a feature request on GitHub so I won’t forget about it. Feel free to add detailed requirements:
Hi, since the last version on August 27, no new exports have been written to my NAS. In the logs there is this error:
TypeError: Cannot read property 'replace' of undefined
    at Promise (/app.js:532:44)
    at process._tickCallback (internal/process/next_tick.js:89:7)
Too bad it’s not working anymore, because I love this app
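For context, that kind of TypeError means .replace was called on a value that was undefined, e.g. a setting that was never filled in. A minimal sketch of the failure mode and a defensive guard (the settings object here is hypothetical, not the app’s actual code):

```javascript
// Hypothetical sketch: .replace on an undefined value throws the error above.
const settings = {}; // imagine a setting that was never filled in

// settings.folder.replace('x', 'y');
// → TypeError: Cannot read property 'replace' of undefined

// A defensive default avoids the crash:
const folder = (settings.folder || '').replace(/[\\/:]/g, '-');
console.log(folder); // '' (falls back to an empty string)
```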
@PieterM Thx for the compliment! Can you please create an issue on GitHub with details of the settings you use and the app log?
Hi @PieterM, I have just updated the app to version 2.5.0. I hope this solves the issue for you.
You can get it now on GitHub, or wait for Athom to approve it in the app store.
Version 2.5.0 was just released in the app store. It has some bugfixes, and it allows you to do each export in a separate folder. The folder is named after the date/time when the export started, so you can easily manage/remove old archives.
Let me know if you like it
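A per-export folder named after the start date/time could look like this. A minimal sketch, assuming a sortable yyyymmdd-hhmmss layout (the exact format the app uses is not documented in this thread, so this naming is an assumption):

```javascript
// Hypothetical sketch: build a sortable date/time folder name such as '20190827-143000'.
function exportFolderName(date = new Date()) {
  const pad = (n) => String(n).padStart(2, '0');
  const ymd = `${date.getFullYear()}${pad(date.getMonth() + 1)}${pad(date.getDate())}`;
  const hms = `${pad(date.getHours())}${pad(date.getMinutes())}${pad(date.getSeconds())}`;
  return `${ymd}-${hms}`;
}

console.log(exportFolderName(new Date(2019, 7, 27, 14, 30, 0))); // '20190827-143000'
```

Names like this sort chronologically in a plain directory listing, which is what makes removing old archives easy.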
I have set up the app and want to use SFTP to export the logs. From the command line, both ssh and sftp work fine with the address and folders. But when I fill these in in the app settings and test them, I get an error. Looking at the logs, it states it is connected to myip:22, but this message is followed by a general Timeout (control socket). Does anyone have an idea what is going on?
edit: I also checked the logs on the server. They also confirm a connection has been made, only that it is being dropped (no official disconnect).
edit 12-10: I have configured a WebDAV server on the same machine, and that one is working just fine. Still no idea what happened with the SFTP, but I am happy to be able to use the app now.
I managed to download the information to my Synology, but I am disappointed to get a zip file for every app, and inside each zip a .csv for each measure.
Would it be possible to have one big CSV ?
Would it be possible to have only the CSV files, without them being in a zip file?
Named with the date and year of extraction.
Mmh. I have to look into that. The zipping is important because during archiving the data is first stored on Homey itself (and Homey has limited memory/storage), and also because exporting non-compressed data takes a very long time over WiFi.
Why would you want it unzipped?
I want to create a Qlik Dashboard based on this data. Unfortunately Qlik can’t read ‘inside’ a zip file.
If we do a weekly extract (max) it doesn’t take so much space.
@Gruijter Great app!
It would be nice to make it possible to select one (or a few) Logics so it will not make an export from all of them. Do you think that is possible?
Thx 4 the compliment
You can add your request as an issue on GitHub so I don’t forget to take a look at it on the next maintenance work I do. But to be honest I think it is too much work to implement it in a user-friendly way.
Thank you for this app! I have a problem exporting data though. Connection (WebDAV) works fine (it creates a test file).
However, when I run the flow there are no files created.
I’ve tried having the flow use both ‘Export all insights’ and ‘Export insights of:’ for various brands. I do have output in Homey’s Insights, so there is data to export.
The log says only:
Exporting all insights last24Hours
Unless, of course, I’m only exporting specific brands. Then there are no log messages.
I have waited long enough for the export and upload to complete (>30 min), but still no output.
I cannot try any other protocol, because of server limitations.
Restarting Homey fixed the issue, now I get all the data and the log looks good. Thanks!
Feature request: make a failed upload to the server trigger a flow. That way I can get a notification when the server is down and troubleshoot the issue immediately.
Good to hear you got it working now! Concerning the ‘trigger flow on error’; I will give it a thought.
I am not able to configure this app to work.
When I use SMB to connect, it says:
"Error: STATUS_LOGON_FAILURE (0xC000006D) : The attempted logon is invalid. This is either due to a bad username or authentication information."
But connecting to this share on my Samba server on Ubuntu 18.04 from another system with the same credentials works fine.
Hey! I had some trouble last year with the Fibaro export. I wrote about it, but didn’t really help you very much with the bug.
I just wanted you to know that it works perfectly now. Not sure when it started to work, but it does.