I’ve been constantly updating zynthian-amp2.local (it’s the big piano keyboard and 7" screen machine), and it doesn’t seem to offer all those nice LV2 plugins . . .
How can I gently remind it to include this stuff which I’ve picked up cleanly on other zynths?
> return method(self, *args, **kwargs)
> File "/zynthian/zynthian-webconf/lib/system_backup_handler.py", line 138, in post
> }[action]()
> File "/zynthian/zynthian-webconf/lib/system_backup_handler.py", line 129, in <lambda>
> 'DATA_BACKUP': lambda: self.do_data_backup(),
> File "/zynthian/zynthian-webconf/lib/system_backup_handler.py", line 162, in do_data_backup
> self.do_backup('zynthian_data_backup', SystemBackupHandler.DATA_BACKUP_ITEMS_FILE)
> File "/zynthian/zynthian-webconf/lib/system_backup_handler.py", line 180, in do_backup
> zf.close()
> File "/usr/lib/python3.4/zipfile.py", line 1534, in close
> self.fp.write(centdir)
> MemoryError
> ERROR:tornado.access:500 POST /api/sys-backup (192.168.1.8) 22867.39ms
> INFO:root:Client disconnected
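For what it’s worth, the `MemoryError` in `zf.close()` suggests the archive is being assembled in RAM, which a Pi can’t sustain for a large data directory. Here is a minimal sketch (not the actual webconf code; `backup_to_tempfile` and its parameters are hypothetical) of the usual workaround: write the zip to a temporary file on disk and stream it back in small chunks.

```python
# Hypothetical sketch, not the zynthian-webconf implementation:
# building a zip inside an in-memory buffer keeps the whole archive in
# RAM and can raise MemoryError on a Pi. Spooling the archive to a temp
# file on disk and yielding it in small chunks keeps memory use flat.
import os
import tempfile
import zipfile


def backup_to_tempfile(paths, chunk_size=64 * 1024):
    """Zip the given files to a temp file on disk, then yield it in chunks."""
    tmp = tempfile.NamedTemporaryFile(suffix=".zip", delete=False)
    try:
        with zipfile.ZipFile(tmp, "w", zipfile.ZIP_DEFLATED) as zf:
            for path in paths:
                if os.path.isfile(path):
                    # Store each file under its basename for simplicity.
                    zf.write(path, arcname=os.path.basename(path))
        # The central directory is now on disk, not in RAM; stream it out.
        tmp.seek(0)
        while True:
            chunk = tmp.read(chunk_size)
            if not chunk:
                break
            yield chunk
    finally:
        tmp.close()
        os.unlink(tmp.name)
```

In a Tornado handler each chunk could be passed to `self.write()` followed by `self.flush()`, so the response streams instead of buffering, which also helps with the timeout problem mentioned below.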
There is definitely room for improvement.
But you will never get 100% because of timeouts.
Or you do it completely differently: WebSockets or the like.
Feel free to do it.
I can live with partial backups.
I am not using backup and restore for sf2 anyway. They don’t change, and I just upload them the normal way… which is limited as well.
Everywhere!! And that’s just the rest of the planet . . .
I think it’s a non-problem, but it’s good to reflect on the process. As you say, if you have a valuable resource, like the mighty Mellotrons in my case, you should look after it somewhere other than on your zynth! . . And I’ve copied them en bloc over ssh. With the current file-by-file mechanism you wouldn’t load too many that you didn’t specifically need, so it’s almost a self-limiting issue.