Exploring possible UI toolkits for the editor

(Ellie Fagerberg) #1

(This post is missing links to stuff. Apologies! If you’re a moderator, feel free to add them where applicable)

There has been a lot of talk lately about which UI toolkit we could use long term for the editor. Choosing a UI toolkit is a complicated matter, as it not only has to be capable enough to fit all of our needs but also needs to be compatible on a technical level. There has already been a lot of bikeshedding around this topic, and this thread is meant to summarize most of what has been discussed so far. Please remember that we’re a respectful community and that trash-talking technologies is never acceptable.

The basic requirements

  • The toolkit has to have basic components or at the very least facilitate very easy creation of those components.
  • The toolkit needs to have easy styling options so that we can match the theme of the editor with the Amethyst branding guidelines.
  • The toolkit needs to work with Vulkan in order for us to be able to embed the game output in the editor as our new renderer will basically require Vulkan for the foreseeable future.

That last point is of particular interest, as the only way to display a Vulkan surface in an OpenGL context is with OpenGL 4.6, support for which isn’t yet widespread at the time of writing. We may be able to use the OpenGL backend in gfx hal, but that would still be far from optimal.

The toolkits

Electron

A prototype has already been built with Electron by randomPoison. The prototype is currently able to communicate with the engine in order to sync and modify state. It’s currently missing any kind of graphical scene editing, which is where the Vulkan <-> OpenGL interop becomes a problem again. Electron is also widely known for taking up a lot of resources. The main upside of using Electron is that it’s very accessible because it uses web technology.

GTK

The upside of GTK is that it would provide a native look and feel on most Linux desktop environments, but that’s basically where the upsides end. There is little point in using it if the intention is to theme the editor to the Amethyst branding, and we still have the graphics interop problem. The Rust bindings are also immature.

Qt

Qt is pretty much the only toolkit that satisfies all requirements, since it has a Vulkan backend and can be easily themed. The main problem with Qt, however, is that it’s heavily tied to C++. Binding generators have been written for Qt that allow Rust code to essentially drive the logic of a C++ Qt application, but it’s unclear whether this would actually be a reasonable approach for a primarily Rust-based community such as the Amethyst community.

Amethyst

Amethyst is currently undergoing a rewrite of its UI, which could potentially be used to create the editor. The problem is that Amethyst is made for games rather than UI applications, and as such could eat up a lot of resources unnecessarily while also being cumbersome to use because of the differing paradigms. The upsides are guaranteed compatibility with the engine, a good way to dogfood the UI side of the engine, and the fact that contributors to Amethyst will immediately be familiar with the editor.

Flutter

The desktop side of Flutter is somewhat up in the air, being only unofficially supported, and even if we could get it up and running, it would probably still have the graphics interop problem.

Azul

Azul is probably the most interesting technology here, being written completely in Rust and running on top of WebRender. If we could somehow get it to run on the rumored gfx hal WebRender branch that apparently exists somewhere, this could be the toolkit that checks all the boxes without any major downsides. The only potential problem is that it’s still immature, but I imagine we could dogfood it quite a bit.

Others?

If you have any other toolkit that you know of that could do everything we need it to, don’t hesitate to tell us about it!

2 Likes

#2

OrbTk was also mentioned, and it doesn’t seem like a bad option, since it is a sub-project of Redox OS.

1 Like

(Tatsuyuki Ishi) #3

I would add Xamarin.

Xamarin is also a framework that uses fully native widgets, which, unlike widgets that merely mimic native styles, also behave correctly when it comes to fine details. Like GTK, this could be either an upside or a downside. (By the way, it has platform-agnostic OpenGL support.)

Xamarin started as a cross-platform framework for mobile apps; however, today there are in-progress backend implementations for WPF (which is Win32, not UWP), macOS, and GTK#.

Xamarin (actually Xamarin.Forms) uses an MVVM architecture, which is what is influencing the popular frontend frameworks today. This can be considered a difference from classic toolkits, where you update widget states manually.

Xamarin only runs on the .NET platform, which includes C#, F#, and Visual Basic. C# is a very popular language, making it accessible to many people. Meanwhile, Xamarin also provides low-level bindings to each platform’s native framework, allowing fine-tuned custom behaviour whenever it’s needed.

0 Likes

(Fletcher) #4

I will throw out https://haxe.org/ and http://haxeui.org. Lesser known, but used to great effect in games like Dead Cells.

0 Likes

(Joshua) #5

Some more minimalistic and basic UI toolkits: https://github.com/Gekkio/imgui-rs or https://github.com/snuk182/nuklear-rust

0 Likes

(Ellie Fagerberg) #6

When you comment about other UI toolkits, please include information about how you think that technology is gonna be able to solve the problems brought up in the original post.

0 Likes

(Erlend Sogge Heggen) #7

Frankly speaking I think there’s very little value to be gained from this discussion. You are approaching this as if Amethyst is a company that can assign someone to do the work once the best tool for the job has been decided. In reality it doesn’t matter if 10 people think Option X is the best way to go if none of them can commit to doing the work. All it takes is a single developer who believes strongly enough in Option Y to do the work, and suddenly a decision has been made.

Rust GUI development is still in its infancy: https://github.com/pythoneer/areweguiyet/wiki
In a year from now you might have completely different frameworks at your disposal. By far the best way to get any answers is to do like @randomPoison and make prototypes. Thanks to their work you are starting to figure out how to make Amethyst play nice with any kind of editor instead of a single blessed one, which further bolsters the notion that perhaps Amethyst is uniquely positioned for a hybrid framework/engine approach.

For that reason I do think it’d be great if there was at least one other serious Editor prototype being made, since that will ensure the work that’s going into amethyst-editor-sync is not accidentally biased towards the only editor it needs to support at the moment.

1 Like

(Erlend Sogge Heggen) #8

Speaking of being front-end agnostic, I think xi-editor provides an excellent example of what that might look like in practice. From the readme:

Frontends

Here are some front-ends in various stages of development:

  • xi-mac, the official macOS front-end.
  • fuchsia/xi, a front-end in Flutter for Fuchsia.
  • xi-gtk, a GTK+ front-end.
  • xi-term, a text UI.
  • xi-electron, a cross-platform front-end based on web-technologies.
  • gxi, a GTK+ front-end written in Rust.
  • xi-win, an experimental Windows front-end written in Rust.
  • kod, a terminal frontend written in Golang.
  • xi-qt, a Qt front-end.

The following are currently inactive, based on earlier versions of the front-end protocol, but perhaps could be revitalized:

There are notes (I wouldn’t call it documentation at this point) on the protocol at frontend.md. If you’re working on a front-end, feel free to send a PR to add it to the above list.

Many of these alternative front-ends for xi are probably the closest thing Amethyst has right now to editor prototypes made with differing frameworks and (cross-)platform strategies. It might even be possible to hack around with a fork of one of these and attempt to inject some Amethyst editing features into the existing UI. If nothing else, it’s a good way to learn a bit about what it’s like to work with that framework.

0 Likes

(Ellie Fagerberg) #9

I would say there’s quite a bit to be gained from this conversation, even if you don’t agree. There are very serious technical challenges that need to be solved, and hopefully agreed upon, in order to have an editor that is future-proof enough to not just stay a prototype. I think there’s been enough bikeshedding about this topic already, and that’s why I made this thread: so that we could take a step back and have a more objective approach to bringing up different alternatives for the editor. Please don’t bring up meta discussions around this topic in this thread again; that can be kept on Discord.

randomPoison and I have already been discussing the problems with their approach a bit, and while they certainly could be solved, my fear is that they’re gonna end up being solved with a mountain of hacks, which is the main thing I think this discussion can help avoid. Being front-end agnostic sounds nice, but it’s probably not gonna work out for us the way it did for Xi, mainly because the editor isn’t gonna be as simple as a text editor.

You also mentioned that I act like Amethyst is a company that can just throw people at problems. That is true to some extent, mainly because the person I’m gonna throw at the problem is myself. If anyone else wants to help, I would of course appreciate that, but it’s not something I’m counting on.

1 Like

(doomy) #10

I understand your sentiment, but I disagree. I find Xi’s frontend agnosticism interesting, but in reality it just makes for a fragmented community, with the official Mac app having the majority of support. Additionally, the technology matters if we want to keep Amethyst easy to set up.

My thoughts: I’m not too keen on using GTK/Qt/Xamarin/Electron or anything that requires an external dependency or a different language. Not only can this make compiling difficult (especially for beginners), but Amethyst is a Rust engine; I imagine many of us use it because we like how Rust solves problems.

That being said, Nuklear is alright since it’s just a header file IIRC, and almost every machine already has a C compiler. I’m also a fan of Azul (being pure Rust), though since it’s less mature, developers may have to spend some time with it - however, I think it’s the most promising Rust GUI library, and a major use of it (with Amethyst) would be awesome for the Rust community as a whole.

I also haven’t really seen much of OrbTk, but I’d be interested in hearing more about it (though AFAIK it also relies on SDL2).

0 Likes

(Erlend Sogge Heggen) #11

Oh! Yeah I must concede I did not take this possibility into account. I apologize for coming on a bit strong and nearly hijacking the topic. :bowing_man:

Coming back to the topic at hand, Azul (based on WebRender, very web-y) and OrbTk (which uses an ECS and names Flutter as a key inspiration) are the most interesting to me, but right now they are still in the pre-release phase. I’m gonna reach out to the developers and see if they might be excited about the prospect of the Amethyst Editor as an early adopter of their GUI framework.

2 Likes

(Florian Blasius) #12

Hey @all,

I’m the maintainer of OrbTk and a member of the Redox OS project. erlend_sh made me aware of this discussion.

As erlend_sh mentioned, OrbTk (0.3) is inspired by Flutter / React and is based on an ECS library.

Here you can find more information about OrbTk:

OrbTk could be a good fit for the Amethyst editor. It’s also written in Rust, uses ECS, should be easy to use, lets you define a custom theme, and works on many platforms.

OrbTk is based on OrbClient, which uses SDL2 for window handling and drawing (except on Redox OS, where we use a custom implementation). We are thinking about switching to glutin, but gfx-rs could also be an option. Once we reach that point, OrbTk should fulfill your basic requirements.

Maybe there is an opportunity to work together: you could help us implement a Vulkan-based solution for OrbTk / OrbClient, and we could help you implement the features in OrbTk that you need to create your editor.

I guess we could reach a version of OrbTk suitable for starting the editor within half a year.

What do you think about it?

4 Likes

(Fschutt) #13

Hi,

well, erlend_sh made me aware of this discussion (I’m the maintainer of Azul). Of course, I’m biased, so take everything I say with a grain of scepticism. Originally I wrote Azul to write a GIS / map editor (screenshot below):

So that should give you an impression of what Azul is capable of right now. In terms of styling, I think it beats other systems (since it’s WebRender-based, I didn’t have to re-implement things like gradients and so on). The lines in the node graph, for example, are drawn via an SVG component (which uses an OpenGL texture + lyon underneath); everything else is WebRender. The UI is ~400 lines of CSS, ~400 lines of callbacks (to handle focusing text fields - yes, this currently has to be done manually), and ~700 lines for DOM construction, split across many functions. Most of the callbacks are for adding and removing nodes, connecting nodes in the node graph, etc. - things that need to be done but aren’t necessarily nice to code. The editor takes ~60 MB of RAM and a few % of CPU time.

Here’s for example what the code for dragging the node graph looks like:

… and the DOM construction for the top toolbar:

This should at least give you an idea of what real-world code looks like. There is some repetition in writing callbacks (especially since Rust currently doesn’t have a custom ? operator, which is why that inner function exists), and there are clipping issues, scrolling issues, etc. However, overall it works fairly well, at least for me. For text input, I am currently working with a simple text.push() / text.pop() system - i.e. the most basic type of text input you could imagine. However, I have, for example, implemented selection and cursor support for the TextInput component, so this should give an impression of how easy or hard it is to write “components” in Azul.

I don’t know if Azul is the right choice for Amethyst - but if you want to use Azul for developing a game engine, please render the game in a separate window. That would solve the “Vulkan, DirectX, etc.” issue - you can render with whatever API you want in your own window and communicate between the editor and the game via a channel or via IPC. And for shipping the game, you simply disable the editor and let the game run as a standalone executable. You don’t need to port WebRender to every backend API. And don’t use Azul for rendering the in-game UI; that would destroy performance (and lead to bugs, because Azul may not clean up OpenGL state). Azul is currently too slow to be injected into a game loop, plus it is retained-mode (i.e. it only redraws when necessary), so it is a good fit for the editor UI, but not the game UI.
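
To make the separate-window idea concrete, here is a minimal sketch of an editor driving the game over a channel. The message types and the overall setup are hypothetical - nothing here is Azul or Amethyst API - and in practice you would probably use IPC between two processes rather than a thread:

```rust
use std::sync::mpsc;
use std::thread;
use std::time::Duration;

// Hypothetical editor -> game messages; not part of Azul or Amethyst.
enum EditorCommand {
    SpawnEntity { name: String },
    SetPaused(bool),
    Shutdown,
}

// The game owns its own window and renderer (Vulkan, DirectX, whatever)
// and only ever sees messages coming in over the channel.
fn run_game(rx: mpsc::Receiver<EditorCommand>) {
    loop {
        // Drain pending editor commands without blocking the frame.
        for cmd in rx.try_iter() {
            match cmd {
                EditorCommand::SpawnEntity { name } => println!("spawn entity: {}", name),
                EditorCommand::SetPaused(paused) => println!("paused: {}", paused),
                EditorCommand::Shutdown => return,
            }
        }
        // ... update and render one frame here ...
        thread::sleep(Duration::from_millis(16));
    }
}

fn main() {
    let (tx, rx) = mpsc::channel();
    let game = thread::spawn(move || run_game(rx));

    // The editor UI keeps only the sender; when shipping, you drop the
    // editor entirely and the game runs as a standalone executable.
    tx.send(EditorCommand::SpawnEntity { name: "player".into() }).unwrap();
    tx.send(EditorCommand::Shutdown).unwrap();
    game.join().unwrap();
}
```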

At least in my experience, it makes little sense to bikeshed over which UI framework to use - try to build a prototype instead. Experiment. See how far you get with Azul, OrbTk, GTK, Qt, etc. in a weekend, and then you’ll have a much better understanding of which framework fits your use case. I.e. just build the most basic game editor you can imagine: try to render a list of game objects, a tree view, the actual game, a list of assets, a code editor, etc. Then record how long it takes to code each component and see which framework is best. Then, and only then, will you be able to estimate how long it would take to implement a game engine editor in each framework. You can argue about the merits and drawbacks of each approach until the cows come home, but that won’t lead to anything in the end. Often you only run into issues once you’re actually building the thing, so my best advice is to just build a simple prototype of the same UI in each framework - that’s really all I can say.

In your position, I’d wait a year or so and focus on other issues - unless you are set on using GTK or Qt, there is currently no “mature” UI toolkit in pure Rust.

6 Likes

(Marco Alka) #14

While there are a lot of really nice toolkits out there, if the editor should be pure Rust anyway, I think using Amethyst_UI itself would be very beneficial. Not only would it provide a GUI for the editor, it would also be built on the Amethyst engine, so it’s fully integrated, and it would definitely push the development of the UI crate a lot. At the moment, you guys are talking about creating a showcase game for Amethyst in order to explore shortcomings when creating a real game from scratch. The editor could be just another such application and help improve the Amethyst GUI library in the process (instead of fixing another, unrelated library).

As for the downside of Amethyst_UI not being focused on UI applications, I don’t even think that’s a big problem. It might end up being a little more effort, because all the textures have to be provided, effects have to be written, etc. - however, isn’t that just what UI creators in general will face? Maybe it will help improve API ergonomics a great deal, and that would be another plus for Amethyst, again~

4 Likes

(Fletcher) #15

I’m not sure the UI library is in a state where we can do this. @jojolepro could give a better opinion. There’s also something to be said for decoupling the editor from the engine and having the engine publish an interface anyone can rely on - for example, an iPad editor or something.

1 Like

(Dobkeratops) #16

Building a Rust UI toolkit with an editor as you go along seems like a good idea. It seems like GUIs are heavily tied to languages; using anything through inter-language bindings is going to be unpleasant.

Another potential benefit of a custom toolkit is adapting it to an HMD/VR/AR interface - think of a UI toolkit heavily integrated with a 3D engine…

1 Like

(Richard Dodd (Dodj)) #17

I came to this forum specifically to talk about the UI library, so I think it’s great that this discussion is happening.

Personally, I would strongly push for a UI framework & editor written in Amethyst. As I mentioned on Discord, there are some great resources for designing a UI engine (e.g. layout in Flutter), so the hard work is actually implementing these solutions.

I think this could have been said a year, or even two years, ago. I think Amethyst has a big enough community that it can push the situation forward for the whole of Rust here. If Amethyst implements a really simple, effective backend, it will then be modularized and consumed by the rest of the Rust community.

That being said, here is my design proposal for the Rust UI engine. I’ll go from how you describe the UI down to how you color the pixels.

UI design

Text format

I actually think RON is a great format for describing UIs. It ties in with Rust data structures really nicely, and is easy enough for humans to read. With serde, it’s easy to swap data formats later anyway if it is decided that something else is better.
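
To illustrate (the widget-tree schema below is entirely made up, not an existing Amethyst format, and it assumes the serde and ron crates), a RON description deserializes straight into Rust structs:

```rust
use serde::Deserialize;

// Hypothetical widget-tree schema, purely for illustration.
#[derive(Debug, Deserialize)]
struct Node {
    widget: String,
    #[serde(default)]
    children: Vec<Node>,
}

fn main() {
    let source = r#"
        Node(
            widget: "Panel",
            children: [
                Node(widget: "Button"),
                Node(widget: "Label"),
            ],
        )
    "#;
    let tree: Node = ron::de::from_str(source).expect("valid RON");
    println!("{:#?}", tree);
}
```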

UI Structure

Taking inspiration from Flutter (and all other UI toolkits), I would describe a UI as a tree of components, where each parent component owns its children. A component can decide whether or not to accept children.

I propose a simple solution for styling components: every component (node in the tree) can have a style for just that component, and a style for both it and its children. Style just for the component always wins, then style for it and its children; otherwise style comes from the nearest parent. Each property is overridden individually - so border-radius may be overridden by the parent, but color might come from the grandparent. A stack can be maintained during the layout phase to pass style information to all components, which they then use during the draw phase. Style doesn’t include size information like width and height; that comes from layout.
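
A minimal sketch of that resolution order, with made-up property names: each property falls back individually, first to the node’s inheritable style, then to whatever the ancestors pushed down.

```rust
// Illustrative only; not an existing Amethyst API.
#[derive(Clone, Default, Debug)]
struct Style {
    color: Option<[f32; 4]>,
    border_radius: Option<f32>,
}

impl Style {
    /// Fill in any property we do not set ourselves from `fallback`.
    fn or(&self, fallback: &Style) -> Style {
        Style {
            color: self.color.or(fallback.color),
            border_radius: self.border_radius.or(fallback.border_radius),
        }
    }
}

struct Component {
    own_style: Style,       // applies only to this component
    inherited_style: Style, // applies to this component and its children
    children: Vec<Component>,
}

/// Walk the tree with a style "stack": `parent_inherited` is what has
/// been accumulated from the ancestors so far.
fn resolve(component: &Component, parent_inherited: &Style) {
    // The node's inheritable style is merged over the ancestors' styles...
    let inherited = component.inherited_style.or(parent_inherited);
    // ...and the node-only style always wins for the node itself.
    let effective = component.own_style.or(&inherited);
    println!("effective style: {:?}", effective);

    for child in &component.children {
        resolve(child, &inherited);
    }
}

fn main() {
    let ui = Component {
        own_style: Style { color: Some([1.0, 0.0, 0.0, 1.0]), ..Style::default() },
        inherited_style: Style { border_radius: Some(4.0), ..Style::default() },
        children: vec![Component {
            own_style: Style::default(),
            inherited_style: Style::default(),
            children: vec![],
        }],
    };
    // The child ends up with border_radius from its parent, but no color.
    resolve(&ui, &Style::default());
}
```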

Where we draw things

In Flutter, a component can draw both underneath and on top of its children.

Layout is determined by a simple algorithm: a single depth-first recursive descent of the tree, where each component tells its children what their constraints are, and they report back their size. Each component can visit its children in any order it chooses, e.g. the fixed-size components first, then the flexibly-sized ones. Sometimes a component can reason that the layout of its children can’t change, but we can ignore this to start with, for simplicity.
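
As a sketch of that protocol (names are illustrative, loosely modelled on Flutter’s box layout rather than any existing Amethyst API):

```rust
#[derive(Clone, Copy)]
struct Constraints {
    max_width: f32,
    max_height: f32,
}

#[derive(Clone, Copy)]
struct Size {
    width: f32,
    height: f32,
}

trait Widget {
    /// Given constraints from the parent, lay out children and report our own size.
    fn layout(&mut self, constraints: Constraints) -> Size;
}

/// A leaf widget with a fixed preferred size, clamped to the constraints.
struct Fixed {
    size: Size,
}

impl Widget for Fixed {
    fn layout(&mut self, constraints: Constraints) -> Size {
        Size {
            width: self.size.width.min(constraints.max_width),
            height: self.size.height.min(constraints.max_height),
        }
    }
}

/// A column stacks children vertically: each child gets the full width
/// and whatever vertical space remains.
struct Column {
    children: Vec<Box<dyn Widget>>,
}

impl Widget for Column {
    fn layout(&mut self, constraints: Constraints) -> Size {
        let mut used_height = 0.0;
        let mut max_child_width: f32 = 0.0;
        for child in &mut self.children {
            let child_size = child.layout(Constraints {
                max_width: constraints.max_width,
                max_height: constraints.max_height - used_height,
            });
            used_height += child_size.height;
            max_child_width = max_child_width.max(child_size.width);
        }
        Size { width: max_child_width, height: used_height }
    }
}

fn main() {
    let mut root = Column {
        children: vec![
            Box::new(Fixed { size: Size { width: 120.0, height: 40.0 } }),
            Box::new(Fixed { size: Size { width: 80.0, height: 20.0 } }),
        ],
    };
    let size = root.layout(Constraints { max_width: 300.0, max_height: 200.0 });
    println!("column is {} x {}", size.width, size.height);
}
```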

Actually drawing them

Once we know each element’s size, we do a depth-first recursive descent asking each component to draw itself. To start with, I propose that we draw everything into one layer on the GPU. Once we have everything rendered, we composite it over the 3D content to get the final image.

Converting primitives to draw calls

Some code needs to be written to translate draw calls from the widgets into draw commands for the GPU. This is where a library like lyon can be used to tessellate 2D shapes like lines, squares, etc. into primitives and colors/textures. Thus we can support 2D SVG-style shapes (using lyon) and anything that renders itself (by just compositing directly). You could even render 3D scenes to textures and draw them onto the UI - I would use this to create a widget for orienting yourself in debug mode (like you get in e.g. Blender).
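
A rough sketch of that translation layer; the tessellation here is hand-rolled for rectangles only, since lyon’s exact API varies between versions, and all types are hypothetical:

```rust
// Hypothetical draw-command layer between widgets and the GPU;
// a real implementation would tessellate arbitrary paths with lyon and
// hand the resulting vertex/index buffers to the renderer's UiPass.
enum DrawCommand {
    Rect { x: f32, y: f32, w: f32, h: f32, color: [f32; 4] },
    Texture { x: f32, y: f32, w: f32, h: f32, texture_id: u32 },
}

#[derive(Default)]
struct Mesh {
    vertices: Vec<[f32; 2]>,
    indices: Vec<u32>,
    colors: Vec<[f32; 4]>,
}

fn tessellate(commands: &[DrawCommand]) -> Mesh {
    let mut mesh = Mesh::default();
    for cmd in commands {
        match cmd {
            DrawCommand::Rect { x, y, w, h, color } => {
                // Two triangles per rectangle; lyon would do this for arbitrary shapes.
                let base = mesh.vertices.len() as u32;
                mesh.vertices.extend_from_slice(&[
                    [*x, *y],
                    [*x + *w, *y],
                    [*x + *w, *y + *h],
                    [*x, *y + *h],
                ]);
                mesh.colors.extend_from_slice(&[*color; 4]);
                mesh.indices
                    .extend_from_slice(&[base, base + 1, base + 2, base, base + 2, base + 3]);
            }
            DrawCommand::Texture { .. } => {
                // Textured quads would go into a separate batch keyed by texture_id.
            }
        }
    }
    mesh
}

fn main() {
    let commands = [
        DrawCommand::Rect { x: 0.0, y: 0.0, w: 100.0, h: 30.0, color: [0.2, 0.2, 0.2, 1.0] },
        DrawCommand::Texture { x: 4.0, y: 4.0, w: 16.0, h: 16.0, texture_id: 7 },
    ];
    let mesh = tessellate(&commands);
    println!("{} vertices, {} indices", mesh.vertices.len(), mesh.indices.len());
}
```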

Widgets

Once we have this infrastructure, it would be easy to build widget abstractions like buttons, text areas, scroll areas, etc., and layout widgets like flex.

Text

Text is quite hard in its own right, but there are good libraries out there to help, like harfbuzz. Here is some info on rendering text correctly. This should probably be treated as a separate problem from layout (the problem is: given an area and some text, lay out the text in the area as best you can).
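
As a toy version of that problem statement, here is a greedy word-wrapper with a fixed per-character advance standing in for real glyph metrics; a real implementation would go through harfbuzz for shaping and handle bidi, kerning, etc.:

```rust
/// Break `text` into lines no wider than `max_width`, assuming every
/// character advances the pen by `char_advance`. Purely illustrative.
fn wrap_text(text: &str, max_width: f32, char_advance: f32) -> Vec<String> {
    let max_chars = (max_width / char_advance).max(1.0) as usize;
    let mut lines = Vec::new();
    let mut current = String::new();

    for word in text.split_whitespace() {
        let needed = if current.is_empty() {
            word.len()
        } else {
            current.len() + 1 + word.len()
        };
        // Start a new line if the word will not fit on the current one.
        if needed > max_chars && !current.is_empty() {
            lines.push(std::mem::take(&mut current));
        }
        if !current.is_empty() {
            current.push(' ');
        }
        current.push_str(word);
    }
    if !current.is_empty() {
        lines.push(current);
    }
    lines
}

fn main() {
    for line in wrap_text("The quick brown fox jumps over the lazy dog", 90.0, 8.0) {
        println!("{}", line);
    }
}
```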

Implementation

If people like this, I’ve got a while between paid jobs to work on it.

The first step is to work out how to draw and composite things (how to use lyon, how to modify the UiPass), then write the layout algorithm, and finally write widgets. There also obviously needs to be event processing, but I think that already works pretty well in Amethyst. It would be good to make everything as decoupled as possible, maybe with a pluggable layout algorithm using the visitor pattern, but this can come later.

I will probably fail if I try to do this on my own, though, so I would need buy-in from other members of the community.

Stretch goals

Stretch goals are to implement the caching Flutter does: layout breaks, layer breaks, and render-to-texture for layers. They’re probably not such an issue on desktop, but they are important for mobile; either way, they can be added after we have an MVP.

What do people think? If this is successful (a fast and efficient way to draw UI on the GPU), then the code will be useful well beyond Amethyst, but this seems like a good place to do the work.

There’s also the issue of how your 2D interface interacts with a 3D world. The best approach here is probably to render your 2D UI to a texture, then render that texture into the 3D scene. This could be done after the initial 2D engine is finished. You could also do more complicated things - maybe different parts of the UI sit at different heights like a hologram, or the UI is animated as if it were part of the 3D scene (it moves with the player’s wrist or something) - but again, these can be tackled as post-processing on the 2D solution.

3 Likes

(Fletcher) #19

@happenslol @jojolepro what are your thoughts on this, especially around the work already done in the UI library?

0 Likes

(Richard Dodd (Dodj)) #20

Happy to contribute to an alternate solution if that’s what we decide btw :wink:

0 Likes

(Fletcher) #21

Oooo you probably shouldn’t have said that… =)

Also, @randomPoison what are your thoughts?

@dodj Right now, we have three needs: in-game UI for the user, an editor for developers to use to make a game, and a prefab editor. The in-game UI library is currently difficult to work with. The UI team’s Trello is here: https://trello.com/b/pKAOAUdn/ui-team if you are curious about what they are working on.

@randomPoison has been working on an Electron-based editor, which can be seen here: https://github.com/amethyst/amethyst-editor

In (my) ideal world, we would dogfood our own UI library by making the editor(s) out of it. My question then becomes: should we start a new one, or contribute to Azul, OrbTk, or something like this: https://github.com/snuk182/nuklear-rust?

Figuring all this out, to the point of having a coherent plan, is something we need to do within the next couple of weeks, since it will be a big part of our 2019 announcement. @dodj I’m specifically interested in your thoughts on starting a new crate versus contributing to an existing one.

0 Likes