Introducing the novel hotspot creation system in Dagon for Unity!


As I’ve been teasing on Twitter, we have a brand new hotspot creation system in Dagonity. This is a far cry from the system we had before, and dare I say it’s one of the most advanced workflows ever devised for adventure games. I’m working on a blog post that will thoroughly explain how it works and why it’s a big deal, but in short:

It’s as simple as assigning a color to any object in 3DS Max (the only software we support for now, though it should be easy enough to extend this functionality to Maya or Blender). Our Max tool exports a list of “colored” items along with their names, plus individual textures with their placement on screen. Then the engine automatically assigns an interaction (be it feedback or a custom action) to these items, so the programmer never needs to know the color, just that there’s an item called “gloves” (for example). Similarly, the writer only needs to know that there are gloves in that scene and write a few lines for them. I’m working on a script that will convert an Excel sheet with all the feedbacks into Dagon’s new JSON format (much the same approach we used in Serena, except the feedbacks were Lua tables).
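To make that Excel-to-JSON step concrete, here’s a rough sketch of what such a converter could look like – this assumes the sheet is first exported to CSV with `item` and `feedback` columns, and the output schema here is just my guess, not the actual Dagon format:

```python
import csv
import json

def feedbacks_to_json(csv_path, json_path):
    """Convert a feedback sheet (exported from Excel as CSV) into a
    Dagon-style JSON language file. Assumed columns: item, feedback."""
    feedbacks = {}
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            feedbacks[row["item"]] = row["feedback"]
    with open(json_path, "w", encoding="utf-8") as f:
        json.dump(feedbacks, f, ensure_ascii=False, indent=2)
    return feedbacks
```

A real converter would read the .xlsx directly (e.g. with openpyxl), but the idea is the same.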

Why is this a big deal? As you know, items can be seen from different angles in a first-person adventure, so previously you may have needed to define several interactive regions for the same item. Worse, if you have a high interaction density per location (and it’s truly HIGH in Asylum), you could spend hours defining interactions for just a single room. Not anymore with this tool – it’s all as simple as writing the texts and exporting the interactive items from 3DS Max. Dagonity will take care of the rest :slight_smile:

More soon!


Wow, this is nice! I think it was cpage that used a similar color-based system, but it wasn’t tied to the models like this. Hotspot creation is definitely a tedious task.

I hope the bad guys don’t get a hold of this because you have created the ultimate pixel-hunting engine! :stuck_out_tongue:

I may be speaking too soon and need more details about how it actually works, but it would seem to be a step backwards wrt open-sourcing the code. There are so many modeling programs out there and it sounds like every single one will require its own release, not to mention what happens when these programs evolve. I’d hate to see peeps with their favorite modelers being left in the cold. Any modification of this approach that makes it modeler-agnostic would be a real boon.


This is fantastic, Agustin! Makes it so much easier to set up the hotspots - the biggest pain was always trying to work out the coordinates and wrangling objects that crossed multiple faces.

From a Blender point of view, the colour separation images should be no problem - it’s quite straightforward to set up a Pass Index for each object and then assign colours in the built-in compositor. This could be done at render time, with both images (the rendered face and the object colour-coded map) output at the same time.

What format does the corresponding JSON look like - can you give an example of what the Max script outputs? It shouldn’t be too difficult to put together a Blender python script to output the related object name, number and hex color in JSON for Dagonity to consume.
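For what it’s worth, here’s a rough sketch of what the JSON-building half of such a Blender script could look like – in practice the names and pass indices would come from `bpy.data.objects`, but that part aside it’s plain Python, and the palette function here is purely hypothetical:

```python
import json

def hotspot_map_to_json(objects):
    """objects: list of (pass_index, name) pairs, e.g. gathered from
    bpy.data.objects in Blender. Returns a JSON string listing id, hex
    colour and item name per hotspot, for Dagonity to consume."""
    def index_to_hex(i):
        # Hypothetical deterministic palette: spread small indices
        # across the blue channel (0, 1, 2, 3 -> 000000..0000ff).
        return "{:02x}{:02x}{:02x}".format(
            (i >> 16) & 0xFF, (i >> 8) & 0xFF, (i * 85) & 0xFF)
    hotspots = [{"id": i, "color": index_to_hex(i), "item": name}
                for i, name in objects]
    return json.dumps({"hotspots": hotspots}, indent=2)
```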


Pipmak did it by colour too, but it used a really funky palette and was a nightmare to get right… it just did hotspots by order of colour, so you scripted them in palette order… OK if there were just a few.


Sounds great Agustin! Can’t wait to try it.


I’ll post samples of the JSON files and the script produced by Max soon (this one is actually a very simple list).

There’s no need to be tied to a particular modeling tool, as you could even provide textures with colorized spots produced with Photoshop.


Agustin, what’s with the game view in the first screenshot? How did you do that?


That’s not in-game but a capture of 3ds Max when we were editing a scene :slight_smile:


Sorry, thought it was something based on a color value WITHIN a model, or something… makes no sense now… where is my brain! (Light bulb goes off: is this the name of my next game attempt? :slight_smile: )
If Dagonity just uses a set of textures with color-coded regions then there is no issue – in fact, it’s very similar to programs like cpage. Each modeler can just design a special render pass to make them, not unlike what you would do to make alpha masks, etc.
Are we going to be using alpha masks? PNGs with alpha layer? A Z-depth texture AND a gray-scale alpha map would be so thuper!


In Vue, you can define multiple render passes, and Vue also uses Python scripts, so for the scenes not already rendered this could possibly work. I’d also love to see examples of the JSON files and the script produced by 3DS Max. I’m assuming the JSON files are a conversion of English.lua?

To get the size and location of spots and to cut patches, I use a variation of your Irfanview technique, but in PhotoPaint. Perhaps, one could “fake” the Max-generated file and at least get a partial benefit. :stuck_out_tongue:


The object mapping file is as simple as this:

	{
	  "1": "Gloves",
	  "2": "Watch",
	  "3": "Corkboard",
	  "4": "Notes",
	  "5": "Shelf",
	  "7": "Light switch",
	  "9": "Boxes",
	  "11": "Trash container",
	  "13": "Radio",
	  "15": "Blanket",
	  "17": "Chair",
	  "19": "Bags",
	  "0": "Door"
	}
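Just to illustrate how an engine could consume that mapping (my guess at the lookup, not Dagonity’s actual code): sample the colour index under the cursor and resolve it to an item name:

```python
import json

# The object mapping file from the post above, loaded as-is.
OBJECT_MAP = json.loads("""{ "1":"Gloves", "2":"Watch", "3":"Corkboard",
 "4":"Notes", "5":"Shelf", "7":"Light switch", "9":"Boxes",
 "11":"Trash container", "13":"Radio", "15":"Blanket", "17":"Chair",
 "19":"Bags", "0":"Door" }""")

def item_under_cursor(index_at_pixel):
    """Resolve the colour index sampled at the cursor to an item name.
    Returns None for indices with no mapped object."""
    return OBJECT_MAP.get(str(index_at_pixel))
```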

As Cleo suggests, the new JSON format works as a replacement for the current English.lua. You won’t have to know JSON to code Dagon scripts, though. That process remains the same: you use Lua to define the interactive objects and their feedbacks. JSON is strictly used for language files and data storage, not scripting.

It’s all darned simple and efficient, especially for localization efforts. I’m going to share more samples soon, we’re currently wrapping up a demo of Asylum :slight_smile:

As for goodies: yes, we’re supporting alpha masks, Z-depth and normal maps for innovative lighting effects. All super-awesome :nod:

> We're currently wrapping up a demo of Asylum :)
If I understand correctly, sounds more than awesome. :)


upcoming teaser: check
alpha-masks, z-depth and normal-maps: check
built-in hotspot system: check
pumped up excitement: check
progress in my own game: de-checked, sigh. I’m buried in 3D printer woes, but hey, maybe I can print up some Asylum tchotchkes :slight_smile:

BTW, are those alpha masks binary or grayscale? Will the alpha channel in PNGs be supported? Come to think of it, I’m not sure we know the current state of video and texture handling – not so much how they’re imported or referenced, but the supported formats and limitations.
Lastly, how are normal maps used in the engine? Normally I create them to render the 2D images for the cube-map textures. A normal map on a cube face would be weird: it would be pointless unless a local 3D light source were flying around, and then you’d get cube seams. Does this imply 3D objects within a node?


Yup, alpha masks are grayscale and PNGs are fully supported. Currently, you can throw pretty much anything you want at Unity, which will compress the textures much like the previous TEX format did. And Theora videos work just the way they used to.

As for normal maps, it’s rather complex, but yes, we’re able to specify a light source (i.e., a flashlight) that makes use of the normal maps you give us. We’re not sure yet how these features will integrate with the Dagon scripts, but you can always create these 3D objects directly in Unity. It’s all very streamlined – I promise it will make sense when you test it :slight_smile:
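For anyone curious about the normal-map idea, the core of it is just per-pixel Lambert shading – a toy sketch of the concept, nothing to do with Dagonity’s actual shader:

```python
def lambert(normal, light_dir):
    """Diffuse brightness from a normal-map sample: max(0, N . L).
    Both vectors are assumed unit-length (nx, ny, nz) tuples."""
    nx, ny, nz = normal
    lx, ly, lz = light_dir
    return max(0.0, nx * lx + ny * ly + nz * lz)
```

A moving flashlight just means re-evaluating this per pixel with an updated light direction, which is why normal maps only pay off when a local light source is actually in play.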


can you use position passes as well?


Thanks for the JSON example, Agustín - far simpler than I assumed, which is always a good thing! I was initially thinking that each item would have a corresponding hex value for its colour in the JSON (theoretically allowing up to 16,777,216 different interactive items), e.g.:

{"hotspots":[ {"id":0,"color":"000000","item":"Janitor"}, {"id":1,"color":"0000ff","item":"Gloves"}, {"id":2,"color":"00ff00","item":"Rope"}, {"id":3,"color":"ff0000","item":"Brush"} ]}

Is it the case that the colours are hard coded to those exact colour values seen in the screenshot? If so, does that mean there’s a limit of 20 interactive items per room (of multiple nodes), or can more colours be easily configured? 20 items are plenty but there may be edge cases where more are needed.

Actually, thinking about that, will setting up hotspots via coded coordinates still be available? That could cover cases where more than 20 interactive objects are needed for a room/node.


Indeed, colors are hardcoded for now, but eventually we’ll provide a mechanism to define more as required. Like you say, though, 20 interactive items per location is plenty, and you can always manually define hotspots as before :slight_smile:


Pretty cool Agustín - good to hear both colour-based and the manual hotspot creation complement each other. 8)

The JSON for working with more colours could actually be even simpler than I was originally thinking (I overthunk! :slight_smile: ):

	"008000":"Light switch",
	"008080":"Trash container",

The unique identifier for hotspots in the JSON could be its actual hex value instead of 1–20.
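A quick sketch of how a hex-keyed lookup could work on the engine side (names and colours here are hypothetical, not Dagonity’s actual values):

```python
import json

# Hex-keyed mapping as suggested above -- the colour itself is the ID.
HOTSPOTS = json.loads("""{
  "008000": "Light switch",
  "008080": "Trash container"
}""")

def item_for_pixel(r, g, b):
    """Return the item name for an (r, g, b) pixel sampled from the
    colour-coded map, or None if the pixel isn't a hotspot colour."""
    return HOTSPOTS.get("{:02x}{:02x}{:02x}".format(r, g, b))
```

This sidesteps the separate 1–20 index entirely: any colour present in the map is interactive, anything else falls through to no-op.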

Anyway, can’t wait to try this out when it’s ready!


If I can get the specifics about what the engine uses for hotspots and how, I can (attempt to) write a script for Blender, especially since the hotspot addon I made is now completely obsolete :frowning:
Fortunately the cube-map maker addon is still relevant. :slight_smile:


I’ll be sharing specifics soon. And hey, we’re still supporting manual creation of spots, so anything you created is still usable :nod: