Shotgun Create slowness

Hello.

When Create was announced, I was looking forward to it. In theory it represents our workflows much better than Desktop.

However, that was at least several months ago, probably a year by now, and we still cannot use Create.

The “Open in…” dialogue takes minutes to open. Sometimes it never opens. Is this normal?

I imagine it has something to do with how our pipeline configuration is set up. But Desktop, although a bit slow, is nowhere near this bad.

We use a hybrid setup in which the default apps are downloaded on the spot and custom configurations are hosted on our internal network. Is this setup supported by Create?

I also noticed that it only happens the first time you try to open a given “type” of file (e.g. a task or a video); after that it’s fast.

This is unrelated to the topic of this particular thread, but I’m also curious: are all Desktop features supported by Create? If not, which ones are? Specifically, are custom-made apps supported by Create? Sorry if there’s documentation for this; I wasn’t able to find it.

Thanks.


Welcome back vtrvtr!

The way Shotgun Create shows actions that can be launched is similar to how the browser integration loads and caches actions in the web app. While Shotgun Desktop only needs to load the actions for a given project, Shotgun Create needs to be able to load any actions for any entity at any moment, which means the loading and caching strategies are quite different.

To do this, it first needs to load the correct environment file and retrieve the commands for the requested entity. If you have multiple pipeline configurations for the same project, each of these needs to be loaded and cached. This is exactly like launching a Toolkit action from the web app, which can feel slow whether Shotgun Desktop or Shotgun Create is serving that request.
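
To make that concrete, here is a rough conceptual sketch (this is not the actual Toolkit or Create code; every name in it is made up) of why the first request for a given entity type is the expensive one and why the cost scales with the number of pipeline configurations:

```python
import time

# Conceptual sketch only -- not how Toolkit or Create is actually implemented.
# It illustrates why the *first* request for a given entity type is slow:
# every pipeline configuration attached to the project has to be resolved and
# its environment scanned before any action can be listed; the result is then
# cached so later requests are cheap.

_action_cache = {}  # (project_id, config_id, entity_type) -> list of action names


def _load_environment_and_commands(config_id, entity_type):
    """Stand-in for resolving a config's environment file and its commands."""
    time.sleep(0.5)  # simulate pulling config files over the network / from disk
    return ["launch_app for config %s / %s" % (config_id, entity_type)]


def actions_for_entity(project_id, config_ids, entity_type):
    actions = []
    for config_id in config_ids:
        key = (project_id, config_id, entity_type)
        if key not in _action_cache:  # cache miss: the expensive path
            _action_cache[key] = _load_environment_and_commands(config_id, entity_type)
        actions.extend(_action_cache[key])
    return actions


# The first call pays the cost once per configuration; repeats are instant.
print(actions_for_entity(120, [45, 46, 47], "Task"))
print(actions_for_entity(120, [45, 46, 47], "Version"))  # new entity type: slow again
print(actions_for_entity(120, [45, 46, 47], "Task"))     # cached: fast
```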

In general, the more Toolkit needs to pull files from the network, whether it’s the Toolkit configuration files or Python source code, the slower the action’s loading and caching will be. This is why, for optimal performance, we recommend using a distributed configuration: upload a copy of your configuration to Shotgun and/or use a git descriptor. That way the pipeline configuration and all custom apps are cached locally on the user’s computer, which speeds up both the initial caching and future loads.
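
For reference, the difference comes down to the descriptor on the Pipeline Configuration. Below is an illustrative comparison; the studio name, paths and version are invented, so treat it as a sketch rather than a copy-paste recipe:

```python
# Illustrative descriptor URIs only -- the studio name, paths and version are
# made up. A "path" descriptor always reads the configuration from a shared
# location, so every scan goes back to the network:
path_descriptor = "sgtk:descriptor:path?path=/mnt/pipeline/configs/tk-config-mystudio"

# A "git" descriptor is versioned, so Toolkit downloads the tagged version once
# and keeps it in the local bundle cache:
git_descriptor = (
    "sgtk:descriptor:git?path=https://git.mystudio.com/tk-config-mystudio.git"
    "&version=v1.2.3"
)

# Similarly, uploading a zipped copy of the configuration to the Pipeline
# Configuration entity in Shotgun lets Toolkit download it once and cache it
# locally the first time it is needed.
```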

You’ve also asked:

I also noticed that it only happens the first time you try to open a given “type” of file (e.g. a task or a video); after that it’s fast.

Do you mean that this is slow the very first time you ever try to open a task and then subsequent instances of Shotgun Create are fast, or do you mean within the same Shotgun Create session? Toolkit provides a basic level of caching between process launches. Create adds another layer of caching within the same process. It would be great to know which caching level feels slow and which feels fast.

This is unrelated to the topic of this particular thread, but I’m also curious: are all Desktop features supported by Create? If not, which ones are?

Shotgun Create uses Toolkit under the hood to enumerate actions and execute them, so both support the same Toolkit features. However, Create does not have a project-centric view where apps run in a shared process. In Create, each Toolkit application is launched individually in its own process.

We hope this clarifies how Toolkit integrates with both applications and that the guidance we’ve provided will help speed up your use of Toolkit.

Do not hesitate to reach out if you have any other questions!

JF


Hello, Jean, thanks for the answer.

I mean inside the same session. Between sessions it takes quite a while to get the apps back again.

I’ll run some tests with a completely distributed config.


Hi!
Thanks for the clarification. One thing I didn’t make clear in my original post is how the caching system works and how network speed can be an issue. Basically, when you request a configuration and there is a cache hit, we then look at the files inside your environment to see if they have changed. If they haven’t, we don’t reload the environment and instead use the cached result. If disk IO is slow, verifying that the cache is not stale can take a while, and this gets worse as you add more pipeline configurations to your project.
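
As an illustration only (this is not the Toolkit source; the function and argument names are invented), the staleness check is conceptually similar to comparing the tracked files’ modification times against a snapshot taken when the cache was written, which is why one slow stat per file, multiplied across several configurations, adds up:

```python
import os

# Conceptual sketch of a cache staleness check -- not the actual Toolkit code.
# On a cache hit we still have to stat every tracked environment file; if any
# of them changed since the cache was written, the cached result is discarded.


def is_cache_stale(tracked_files, cached_mtimes):
    """tracked_files: list of paths; cached_mtimes: {path: mtime at cache time}."""
    for path in tracked_files:
        try:
            current = os.path.getmtime(path)  # one disk/network round trip per file
        except OSError:
            return True  # file is gone or unreadable: definitely stale
        if cached_mtimes.get(path) != current:
            return True  # something changed: reload the environment
    return False  # everything unchanged: safe to reuse the cached actions
```
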
JF


Hello. I’ve had some time to test this more. What’s the explanation for this behavior?

As you can see, there are many configuration ‘caches’, and their timestamps are very close together. That’s because every time I close Create and open it again, it re-downloads most of the configuration. What gives?

Also, is there a way to reload a pipeline configuration inside Create? In Desktop you could “exit” and re-enter a project to force a reload; is there something equivalent?

EDIT:
Not only that, but it seems it re-downloads files for configurations that the current user doesn’t even have access to.


Hi @vtrvtr!

Thank you for sending along this screenshot of your cache folders. It would be helpful to know a bit more about the Toolkit configurations you’re using for your projects. For one, are you using any path or dev descriptors for your configs? With shotgun descriptors, our mechanisms can scan the config to ensure that the local cache is up to date, but with path/dev descriptors, we recache the config every time we regenerate the browser integration menus in Shotgun.

We also noticed something here that might be relevant: in the names of the folders, the p... number indicates a project id, and the c... number indicates a pipeline configuration id. It also looks like your projects tend to have many configurations associated with them. When multiple configs are associated with a project, each one is processed by the browser integration code, so this can slow things down considerably.

So, you may want to test out a project with only a single configuration associated with it, and compare the performance with what you’re currently seeing.
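
If it helps with that audit, a short script along these lines (a sketch using the shotgun_api3 Python API; the site URL, script credentials, project id and the exact field names are assumptions to adapt to your site) will list every configuration attached to a project, together with its descriptor and user restrictions:

```python
import shotgun_api3

# Placeholders: replace the site URL, script credentials and project id.
sg = shotgun_api3.Shotgun(
    "https://mystudio.shotgunstudio.com",
    script_name="config_audit",
    api_key="XXXXXXXX",
)

configs = sg.find(
    "PipelineConfiguration",
    [["project", "is", {"type": "Project", "id": 123}]],
    ["code", "descriptor", "users"],  # assumed field names for descriptor / user restrictions
)
for config in configs:
    print(config["id"], config["code"], config["descriptor"], config["users"])
```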

cheers,
tannaz


Hello, Tannaz.

Yeah, all descriptors are of path or dev type.

Currently we have 12 pipeline configs, but only one of them is used in production. The other 11 are heavily restricted in terms of users. Is that not enough? Do I actually have to delete configurations to keep them from slowing things down? If so, is there a way to just disable them instead of deleting them completely?

It seems to me the only reasonable way to set this up is with a distributed config. I’ll try that.

Thank you.


Just an update on this: migrating all configs to a distributed-style setup indeed speeds things up massively.

Thanks for the feedback, everyone.
