I've been avoiding looking at a network profile of the RMM Dashboard, but stumbled on it tonight and laughed when I saw why: loading takes over _200_ separate HTTP requests and _20MB_ of content download. I'm attaching screencast GIFs of this in both Chrome and Firefox Quantum (the new fastest browser); each recording stops when the page finishes rendering. No wonder first load is nearly 90 seconds. But it doesn't have to be this …
It would be nice to see what type of RAM is in a device. This would aid us in purchasing new RAM, since we could then see exactly what sort to buy.
Hi LogicNow, we would like to see additional options for the Synchronization status in the Backup Manager. What we can see now is that the synchronization to the LSV is at 33% and to the Remote Storage it is at 89%. We would like some additional information, such as:
- how much data needs to be processed, how much is already processed, and how much is left to be processed
- exactly what is being processed, which data and folders …
Would like to be able to specify more than one SpeedVault location. An onsite SpeedVault is great for quick recovery of individual files or servers; however, in the event of a disaster, when using VDR in another location, restore times can be very long. If we could specify a second SpeedVault location at a disaster recovery site connected via site-to-site VPN, this could significantly reduce disaster recovery times.
Please speed up the Software License Group search. Creating groups and transferring multiple items weighs on my productivity.
The Local Speed Vault needs to have some reporting or alerting available in order to manage space, alert on failed writes to devices, etc. In many cases this service will be used outside of RMM, where the backup provider has no other way to discover hardware issues. So it is imperative that Local Speed Vaults be checked for consistency, compared against cloud storage, and reported on separately from the online storage status. …
Right now, it takes a LONG time to install the OS X agent, due to the post-installation scripts.
This is fine if deploying over Remote Desktop, but not if going from station to station.
There is no reason, as far as I can tell, that the asset scan must be run during installation. Could it not be run by the agent in the background afterwards?
Implement a process to perform backups of large amounts of data (500GB+) to a Local Vault that can be shipped to the data center, so that subsequent backups (i.e. differential/incremental) can quickly be updated overnight. As it stands now, a 500GB backup over ADSL will take 90 days. There has to be a faster way.
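The 90-day figure checks out as rough back-of-the-envelope arithmetic, assuming a typical ADSL upstream of around 0.5 Mbit/s (the actual line speed is an assumption, not stated above):

```python
# Rough check of the "90 days" claim for a 500GB initial backup.
size_bits = 500 * 8 * 10**9            # 500 GB expressed in bits
upload_mbps = 0.5                      # assumed ADSL upstream speed
seconds = size_bits / (upload_mbps * 10**6)
days = seconds / 86400
print(round(days))                     # ≈ 93 days, matching the complaint
```

Even doubling the upstream only halves that, which is why a seeded Local Vault shipped to the data center is the practical fix.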
I would love to be able to monitor ping response times for some sites; a few sites have problems with their internet connections, and some using Citrix suffer from anomalies with their routers. A ping-time check could alert us to degradation on the link or a problem with a router. Example: if the ping response time is over 150ms for 5 consecutive pings, send an alert. But allow the time and number of pings to be selected by …
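The rule described above (alert when N consecutive pings exceed a threshold) can be sketched roughly like this; the 150ms/5-ping values, the `ping -c 1` invocation (Linux/macOS style), and the function names are all assumptions for illustration:

```python
import re
import subprocess

THRESHOLD_MS = 150   # user-configurable, per the request
CONSECUTIVE = 5      # user-configurable, per the request

def ping_ms(host):
    """Return one round-trip time in ms, or None if the ping failed.

    Parses the output of the Unix-style `ping -c 1` command; on
    Windows the flag and output format would differ.
    """
    out = subprocess.run(["ping", "-c", "1", host],
                         capture_output=True, text=True).stdout
    m = re.search(r"time=([\d.]+) ms", out)
    return float(m.group(1)) if m else None

def should_alert(samples, threshold=THRESHOLD_MS, streak=CONSECUTIVE):
    """True when the last `streak` samples all exceed `threshold`.

    A failed ping (None) is treated as exceeding the threshold.
    """
    recent = samples[-streak:]
    return len(recent) == streak and all(
        s is None or s > threshold for s in recent
    )
```

A monitoring loop would append each `ping_ms()` result to a list and raise an alert whenever `should_alert()` turns true, exactly the degradation signal asked for.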