[This post is part of a series of articles about working from home in Shotgun.]
This post tackles how you can upload and download published files so that they can be shared with your remote-based team.
It won't include the actual code needed to upload files to, and download them from, a cloud storage provider, as each provider has its own APIs; instead it provides a basic framework as a starting point and shows where you would implement such changes in the Toolkit code.
The idea of this example is that when a user publishes files, the publish proceeds as usual, creating a PublishedFile entity in Shotgun, but the files are also uploaded to a cloud storage provider such as Google Drive, OneDrive, or Dropbox.
Then, when another user wants to use those publishes, they would use the loader app to download them.
Before we start, a couple of things to note:
- Cloud storage providers often have a folder sync feature, where a folder on your local drive is continually monitored and synced to the remote storage. Whilst this can work for some people, it can also lead to data corruption, so I would advise against using it. Our suggested approach is more of a push/pull-as-required method, where no files are overwritten.
- Your Shotgun site is not a good place to store large files such as Maya scene files; we don't offer that service, so you should use another provider for your remote file storage.
Please note that the code provided in this example is not tested in production and is not guaranteed to work; it is intended as an example that can be built upon.
Add the example tk-framework-remotestorageexample repo to your config. You could use a git descriptor and point directly at our repo, but I would suggest forking it and distributing it yourself, as we are not going to maintain or support it (it's an example!), and if we do update it, we won't necessarily try to maintain backwards compatibility.
A Github release descriptor might work well for this purpose.
Add the example tk-multi-publish2 hook to your config. This hook runs at the end of your publish and uses the framework to upload the files to the remote storage. Copy the hook to your config's hooks folder and update the publish settings to use it. In my screen grab, I've just implemented it in Maya, but you should set it on all environments you wish to use it on.
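To make the shape of that step concrete, here is a minimal sketch of a publish post-phase hook that pushes each published file to remote storage after the normal publish completes. The framework's upload_publish method (and the exact shape of the publish item properties) are assumptions for illustration; substitute your storage framework's actual API.

```python
# Sketch of a tk-multi-publish2 post-phase hook that uploads every
# published file once the publish phase has finished.
try:
    import sgtk
    HookBaseClass = sgtk.get_hook_baseclass()
except ImportError:
    # Allows this sketch to be read and exercised outside a Toolkit session.
    HookBaseClass = object


class UploadPostPhaseHook(HookBaseClass):
    def post_publish(self, publish_tree):
        """Called by the publish app after all items have been published."""
        for item in publish_tree:
            publish_data = item.properties.get("sg_publish_data")
            if not publish_data:
                # This item didn't create a PublishedFile entity; skip it.
                continue
            self._upload(
                publish_data["path"]["local_path"], publish_data["id"]
            )

    def _upload(self, local_path, published_file_id):
        # Hypothetical framework call -- replace with your provider's API.
        remote_storage = self.load_framework(
            "tk-framework-remotestorageexample"
        )
        remote_storage.upload_publish(local_path, published_file_id)
```

The upload is isolated in a single method so the real provider-specific logic lives in one place.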
Copy the example tk-maya_loader.py hook over to your config and set the Maya loader settings to use it. This hook will then download the files when the user chooses to reference or import a scene. The example makes use of hook inheritance, so it only implements the required changes and leaves the rest to the base hook. With a little work, the same logic can easily be applied to other software loader hooks, not just Maya.
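The hook-inheritance pattern mentioned above can be sketched as follows: the derived hook only intercepts execute_action to fetch a missing file, then defers to the base loader hook for the actual reference/import. The download_publish framework method is a hypothetical name for illustration.

```python
import os

try:
    import sgtk
    HookBaseClass = sgtk.get_hook_baseclass()
except ImportError:
    # Allows this sketch to be exercised outside a Toolkit session.
    HookBaseClass = object


class RemoteDownloadActions(HookBaseClass):
    """Loader actions hook that fetches missing files from remote
    storage before the standard action logic runs."""

    def execute_action(self, name, params, sg_publish_data):
        local_path = sg_publish_data["path"]["local_path"]
        if not os.path.exists(local_path):
            self._download(local_path, sg_publish_data["id"])
        # Hand everything else to the base hook. Outside Toolkit the
        # base class (object) has no execute_action, hence the guard.
        base = super(RemoteDownloadActions, self)
        if hasattr(base, "execute_action"):
            return base.execute_action(name, params, sg_publish_data)

    def _download(self, local_path, published_file_id):
        # Hypothetical framework call -- replace with your provider's API.
        remote_storage = self.load_framework(
            "tk-framework-remotestorageexample"
        )
        remote_storage.download_publish(local_path, published_file_id)
```

Because only execute_action changes, the same derived class works against any software's loader actions hook, which is what makes the logic portable beyond Maya.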
Copy the example tk-framework-remotestorageexample hook over to your config, and configure the framework to use it. Note that this example hook does not upload files to any remote storage; instead it copies them to a folder in your $HOME directory. It is here as a proof of concept, and you would want to modify it to use your cloud storage provider's API for uploading and downloading files.
Now it should be ready to use. Publishing will cause the PublishedFiles to be copied to the mock folder in a flat structure, prefixed with the id. Loading will copy the files back to the location they were originally published to.
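A minimal stand-in for that mock-storage behaviour might look like this. The class and method names, and the folder name under $HOME, are assumptions for illustration; in a real setup you would replace the file copies with your cloud provider's upload/download calls.

```python
import os
import shutil


class MockRemoteStorage(object):
    """'Uploads' by copying into a flat local folder, naming each file
    '<published-file-id>_<basename>' so publishes that share a file name
    cannot collide. Downloads restore the file to its original path."""

    def __init__(self, storage_root=None):
        # Folder name is an assumption; the example hook uses a folder
        # in the $HOME directory.
        self.storage_root = storage_root or os.path.join(
            os.path.expanduser("~"), "mock_remote_storage"
        )

    def _remote_path(self, local_path, published_file_id):
        # Flat structure: id prefix plus original file name.
        return os.path.join(
            self.storage_root,
            "%d_%s" % (published_file_id, os.path.basename(local_path)),
        )

    def upload_publish(self, local_path, published_file_id):
        os.makedirs(self.storage_root, exist_ok=True)
        shutil.copy2(
            local_path, self._remote_path(local_path, published_file_id)
        )

    def download_publish(self, local_path, published_file_id):
        # Copy the file back to the location it was published to.
        os.makedirs(os.path.dirname(local_path), exist_ok=True)
        shutil.copy2(
            self._remote_path(local_path, published_file_id), local_path
        )
```

Swapping the two shutil.copy2 calls for provider API calls (and keeping the id-prefixed naming) is all that is needed to move from the mock to a real backend.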
Let us know if you spot any issues or have any suggestions!
It should also be noted that this code doesn’t take into account dependencies.
The best thing to do would be to make sure that everything is published (and therefore uploaded), and that you are tracking the dependencies/connections in Shotgun between the published scene files and everything they depend on. You could then update the framework's hook to check in Shotgun whether the PublishedFile has any dependencies, and download those as well if they don't already exist locally.
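One way to gather those dependencies is to walk the PublishedFile entity's upstream_published_files field recursively. This is a sketch under the assumption that your dependency tracking uses that field; the query function is injected so it works with any object exposing a Shotgun-style find_one.

```python
def collect_dependencies(sg, published_file_id, _seen=None):
    """Return the ids of the given PublishedFile plus every upstream
    dependency, following the 'upstream_published_files' field.

    'sg' is a Shotgun API connection (or anything with a compatible
    find_one method). A visited set guards against dependency cycles.
    """
    seen = _seen if _seen is not None else set()
    if published_file_id in seen:
        return seen
    seen.add(published_file_id)

    entity = sg.find_one(
        "PublishedFile",
        [["id", "is", published_file_id]],
        ["upstream_published_files"],
    )
    for dep in (entity or {}).get("upstream_published_files") or []:
        collect_dependencies(sg, dep["id"], seen)
    return seen
```

The loader hook could call this before downloading, then fetch any id whose file doesn't exist locally.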