ADR-66: Emotes System for Renderer (Unity)


NFT emotes are arriving in Decentraland with a whole new set of requirements and challenges. This document serves as a guide for an implementation that includes the new features while following the Renderer's principles and standards.

Problem Statement

Emotes are, in essence, just wearables with some extra data. The same UX expected from wearables is now expected from emotes: they have representations, rarity, and tags, and you can sell or trade them.

The Renderer will no longer contain all the emotes embedded; most of them will live in the ContentServer, and this process must be invisible to the user.

Additionally, due to memory limitations, users can now equip up to ten emotes. Any emote that is not equipped cannot be played.




Tracking Emotes Usage

Knowing which emotes are equipped is important to optimize the memory usage.

A new collection has been implemented to ease this task: RefCountedCollection tracks the number of uses of each key. This allows prioritizing which emotes must be loaded and disposed.

Therefore, the DataStore contains an emotesOnUse : RefCountedCollection.
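The ref-counting idea can be sketched as follows. The Renderer's actual implementation is C#; this TypeScript sketch is illustrative only, and every name and signature in it (increaseRefCount, onAdded, etc.) is an assumption, not the real API.

```typescript
// Illustrative sketch of a ref-counted collection: a key is "in use" while
// its count is above zero; events fire on the 0→1 and 1→0 transitions so
// loading and disposal can react. Names are hypothetical.
class RefCountedCollection<TKey> {
  private counts = new Map<TKey, number>();
  onAdded: (key: TKey) => void = () => {};   // first user appeared: load
  onRemoved: (key: TKey) => void = () => {}; // last user gone: dispose

  increaseRefCount(key: TKey): void {
    const count = (this.counts.get(key) ?? 0) + 1;
    this.counts.set(key, count);
    if (count === 1) this.onAdded(key);
  }

  decreaseRefCount(key: TKey): void {
    const count = this.counts.get(key) ?? 0;
    if (count === 0) return;
    if (count === 1) {
      this.counts.delete(key);
      this.onRemoved(key);
    } else {
      this.counts.set(key, count - 1);
    }
  }

  refCount(key: TKey): number {
    return this.counts.get(key) ?? 0;
  }
}
```

With this shape, two systems equipping the same emote only trigger a single load, and the emote is only released once both have let it go.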

Emotes Animations Plugin

The GLTFs carrying the animations must be downloaded for the emotes in use.

A plugin has been implemented that reacts to changes in emotesOnUse, downloads the GLTF, retrieves the animation, and makes it available for use.

Any animation already retrieved and ready is collected in a Dictionary in the DataStore, allowing every animation consumer to be completely decoupled from the plugin itself.


*When unloading animations, the disposal of the GLTF takes effect after the removal of the animation (so every system using the animation can react before it gets unloaded). For readability's sake, the diagram shows the steps in the inverse order.
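The plugin flow can be sketched like this. Again, the real implementation is C# inside the Renderer; this TypeScript version is a hypothetical illustration, and loadGltf, onEmoteAdded, and the internal maps are assumed names, not the actual API.

```typescript
// Illustrative sketch: react to an emote coming into use, "download" its
// GLTF, extract the animation clip, and publish it in a shared dictionary
// that consumers read without ever touching the plugin.
type AnimationClip = { emoteId: string; clipName: string };

// Stand-in for the real GLTF download + animation retrieval step.
function loadGltf(emoteId: string): { clip: AnimationClip; dispose: () => void } {
  return {
    clip: { emoteId, clipName: emoteId + "_anim" },
    dispose: () => { /* here the GLTF resources would be freed */ },
  };
}

class EmotesAnimationPlugin {
  // Ready animations live here (mirroring the Dictionary in the DataStore).
  readonly animations = new Map<string, AnimationClip>();
  private disposers = new Map<string, () => void>();

  onEmoteAdded(emoteId: string): void {
    const { clip, dispose } = loadGltf(emoteId);
    this.animations.set(emoteId, clip);
    this.disposers.set(emoteId, dispose);
  }

  onEmoteRemoved(emoteId: string): void {
    // Remove the animation first so every consumer can react,
    // then dispose the GLTF, matching the ordering noted above.
    this.animations.delete(emoteId);
    this.disposers.get(emoteId)?.();
    this.disposers.delete(emoteId);
  }
}
```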


Emotes are tied to the avatar itself. The ones equipped by a user are retrieved along with the other wearables in the profile, and they must be treated specifically as emotes. Also, the body shape affects which representation's GLTF is downloaded, and due to the limitations of the legacy animation flow, animations must be prewarmed at a specific point in the avatar loading process.
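Representation selection by body shape can be sketched as below. The field names (bodyShapes, mainFile) loosely follow the wearable entity layout but are assumptions for illustration, as is the selection helper itself.

```typescript
// Illustrative sketch: an emote, like a wearable, can carry several
// representations; the avatar's body shape decides which GLTF to download.
interface Representation {
  bodyShapes: string[]; // body shape URNs this representation supports
  mainFile: string;     // the GLTF to download for those body shapes
}

function selectRepresentation(
  representations: Representation[],
  bodyShape: string,
): Representation | undefined {
  return representations.find((r) => r.bodyShapes.includes(bodyShape));
}
```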

The Avatar System is complex enough to be explained in its own ADR-65.

The usage of runtime animations forced the GLTFImporter implementation to rely on legacy animations. There is plenty of accessible literature on this matter, but in short: the old system uses an Animation component, while the new one uses an Animator component based on Mecanim. There is ongoing research to evaluate whether newer Unity versions allow the usage of runtime animations with an Animator, but at the moment the implementation is tied to the legacy one.


Most of the UI complexity is Unity specific and falls outside this ADR's scope. The key requirement from the architecture is already constrained by the use of the DataStore.

The UI can request emotes to be loaded (in the backpack, for example) through emotesOnUse and know when an animation is ready by listening to changes in Animations.
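This UI flow can be sketched as follows. The bridge class and every name in it are hypothetical TypeScript stand-ins for the C# DataStore wiring, shown only to illustrate the request/notify handshake.

```typescript
// Illustrative sketch: the backpack bumps an emote's ref count to request
// loading, and subscribes to readiness before enabling the "play" action.
class EmotesUiBridge {
  private refCounts = new Map<string, number>();
  private readyListeners = new Map<string, Array<() => void>>();

  // UI side: request the emote and subscribe to its animation being ready.
  equip(emoteId: string, onReady: () => void): void {
    this.refCounts.set(emoteId, (this.refCounts.get(emoteId) ?? 0) + 1);
    const listeners = this.readyListeners.get(emoteId) ?? [];
    listeners.push(onReady);
    this.readyListeners.set(emoteId, listeners);
  }

  // Plugin side: called when the animation lands in the shared dictionary.
  notifyAnimationReady(emoteId: string): void {
    (this.readyListeners.get(emoteId) ?? []).forEach((cb) => cb());
  }

  refCount(emoteId: string): number {
    return this.refCounts.get(emoteId) ?? 0;
  }
}
```

Note that the UI never talks to the downloader directly; it only writes to and reads from the shared data, which is what keeps it decoupled.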


The systems created are easily testable thanks to the architecture used. The data-driven design centered on the DataStore allows the dependencies to be mocked entirely, keeping the test suite as simple as possible.
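As an illustration of why mocking stays trivial, consider a consumer that only reads from the shared dictionary. The class and names below are hypothetical TypeScript stand-ins, not the Renderer's actual C# types; the point is that a test can inject a hand-filled Map instead of standing up the whole download pipeline.

```typescript
// Illustrative sketch: a consumer that plays whatever animation the shared
// dictionary holds. In a test, that dictionary is just a Map filled by hand.
type AnimationClip = { clipName: string };

class AvatarEmotePlayer {
  constructor(private animations: Map<string, AnimationClip>) {}

  play(emoteId: string): string {
    const clip = this.animations.get(emoteId);
    return clip ? `playing ${clip.clipName}` : "emote not loaded";
  }
}

// Mocked data injected directly — no downloader, no plugin, no GLTF.
const mockedAnimations = new Map([["wave", { clipName: "wave_anim" }]]);
const player = new AvatarEmotePlayer(mockedAnimations);
```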



Giving the spotlight to the data itself, instead of the systems, permits easy decoupling between all the actors working with emotes. It also eases the testing process, because mocked data can be injected at any point in the flow.

The EmotesAnimationPlugin also handles embedded emotes, making the rest of the system completely agnostic to them. This allows ongoing discussions, such as "how to handle base-wearables in a decentralized way", to proceed properly without becoming a blocker.


Downloading animations, controlling their lifecycle, and disposing of them is the heavy lifting of this system. There is a handful of valid approaches to this problem and few reasons to pick one over another; even a singleton (forbidden word) would solve it in a fairly clean way.

Since the plugin system is already consolidated in the Renderer architecture, there is no need to stray from it. You can read about the benefits of using it here.


Status: Living

Copyright and related rights waived via CC0-1.0.