Possible to access resources outside of systems?

(Nolan) #1

I’m building my own audio system based on Alto and am hitting some snags. Alto uses a Context trait for creating sources, buffers, etc. via calls like context.new_static_source(...). I’m creating and adding the context as a resource like so:

    fn setup(&mut self, res: &mut Resources) {
        Self::SystemData::setup(res);
        let alto = Alto::load_default().unwrap();
        let device = alto.open(None).unwrap();
        let context = device.new_context(None).unwrap();
        res.insert(context);
    }

And I can successfully access it from systems. But I’d like to access it from elsewhere. In particular, it’d be great if I can pass it to AssetLoader so it propagates to SimpleFormat instances via their options, and I can call context.new_buffer(...) and store assets directly as Alto buffers. Unfortunately, this passes back a Read which requires lifetime parameters, and I’m not sure if I can introduce those into my own code. Here’s my attempt at trying to do this with my SimpleFormat instance:

#[derive(Clone)]
pub struct OggFormat<'a> {
    phantom: PhantomData<&'a Context>,
}

impl<'a> SimpleFormat<Audio> for OggFormat<'a> {
    const NAME: &'static str = "OGG";

    type Options = (&'a Context);

    fn import(&self, bytes: Vec<u8>, (context): (&'a Context)) -> Result<Audio, Error> {
...

This gives:

error[E0495]: cannot infer an appropriate lifetime for lifetime parameter `'a` due to conflicting requirements
  --> src/audio.rs:44:10
   |
44 | impl<'a> SimpleFormat<Audio> for OggFormat<'a> {
   |          ^^^^^^^^^^^^^^^^^^^
   |
note: first, the lifetime cannot outlive the lifetime 'a as defined on the impl at 44:6...
  --> src/audio.rs:44:6
   |
44 | impl<'a> SimpleFormat<Audio> for OggFormat<'a> {
   |      ^^
   = note: ...so that the types are compatible:
           expected amethyst_assets::asset::SimpleFormat<audio::Audio>
              found amethyst_assets::asset::SimpleFormat<audio::Audio>
   = note: but, the lifetime must be valid for the static lifetime...
note: ...so that the type `&alto::al::Context` will meet its required lifetime bounds
  --> src/audio.rs:44:10
   |
44 | impl<'a> SimpleFormat<Audio> for OggFormat<'a> {
   |          ^^^^^^^^^^^^^^^^^^^

So I guess I’m on the wrong track here. Any tips on how I can go about sharing this Context instance? Are there any smart pointer types that might help?

0 Likes

(Théo Degioanni) #2

Formats cannot use external references because they will be used for as long as the asset needs to load, which doesn’t have a statically determined lifetime.

What you are trying to achieve is unsafe because asset loading is done in parallel, so you cannot know whether the context is being used elsewhere while the asset is loading. The only thing you can do is monitor asset loading on the main thread, using a ProgressCounter for example, and register it once it has loaded.
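For instance, here is a minimal sketch of that main-thread approach (the path, the format's Options value, and the Audio asset type are placeholders based on your snippets, not a prescribed API):

// Kick off loading while tracking progress.
let mut progress = ProgressCounter::new();
let handle = {
    let loader = world.read_resource::<Loader>();
    loader.load(
        "audio/cockpit.ogg",    // hypothetical path
        OggFormat::default(),   // your format, minus the lifetime parameter
        (),                     // your format's Options
        &mut progress,
        &world.read_resource::<AssetStorage<Audio>>(),
    )
};

// Later, on the main thread (e.g. in a State's update):
if progress.is_complete() {
    // The asset is in the storage; it is now safe to fetch the Context
    // resource and do any Context-dependent work with the loaded data.
}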

0 Likes

(Nolan) #3

Got it. Is there a recommended pattern for something like this, where I load an asset into raw bytes but then want to convert it to something my code will use directly?

Also, is there any way to access a resource outside of a system? Or are they only meant for use within systems?

Thanks.

0 Likes

(Théo Degioanni) #4

You can create a custom asset type “AudioRaw” or something like that. It would describe the audio data in a raw format that you would then process the way you want it. Potentially, you could even pass that AudioRaw asset to another AssetLoader that would convert it to what you want (or not).

You can access resources either from a system or from anywhere you have access to a World instance. For the latter, you can use the World::exec method as follows:

world.exec(|(my_res, my_comps): (Read<MyResource>, ReadStorage<MyComponent>)| {
    // Do stuff with the resources and components
});

The closure’s parameter can be any SystemData. The closure will be executed immediately.

1 Like

(Nolan) #5

So there are a couple different issues here:

  1. Access to resources outside of systems.
  2. Access to resources while loading assets, and asset-processing in general.

I’m going to focus on #1 at the moment because it has more immediate implications for the system I’m designing.

I added the following dummy test method to my AudioEmitter:

impl AudioEmitter {
    pub fn new(context: Context) {
    }
}

I imagine I’ll get a reference but I’m not sure. I then added this to a method with access to a world:

        world.exec(|(context): (ReadExpect<Context>)| {
            AudioEmitter::new(&context);
        });

Yeah, I’m passing a reference. Ignore that inconsistency for the moment.

This code gives:

error[E0308]: mismatched types
   --> src/main.rs:326:31
    |
326 |             AudioEmitter::new(&context);
    |                               ^^^^^^^^ expected struct `alto::al::Context`, found reference
    = note: expected type `alto::al::Context`
               found type `&shred::res::data::Read<'_, alto::al::Context, shred::res::setup::PanicHandler>`

Getting rid of the reference doesn’t change much: it still expects a Context but instead gets a shred::res::data::Read…

I’m using ReadExpect because Context doesn’t implement Default. Do I need to unwrap the type somehow? .unwrap() doesn’t work.

Also, how is exec different from something like this?

        let cockpit = {
            let storage = world.read_resource::<AssetStorage<Audio>>();
            let sounds = world.read_resource::<Sounds>();
            storage.get(&sounds.cockpit).unwrap().clone()
        };

I used a similar pattern to try getting a Context initially, but that threw something like 20 pages of compiler error message. I can trigger that again if it’d be useful to have a snippet of those. :slight_smile:

Thanks for all the help.

0 Likes

(Théo Degioanni) #6

The Context resource is owned by the ECS; you cannot move it out, because that would be unsafe. exec gives you a reference to the resource in the shape of a ReadExpect. Do everything you need to do with the Context while inside the closure, but do not move it out.

exec saves you from a trillion error messages by making the scope of the borrows explicit. The two approaches are effectively used the same way, but not using exec opens the door to more lifetime misunderstandings.
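For example, a minimal sketch, assuming AudioEmitter::new takes a &Context rather than an owned Context:

// The ReadExpect guard derefs to the Context, so borrow through it inside the closure.
world.exec(|context: ReadExpect<Context>| {
    AudioEmitter::new(&*context); // pass a borrow; never move the Context out
});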

0 Likes

(Nolan) #7

Thanks, that helps. So given I have this Context that I’m storing in a resource, and given that I need to create Buffer instances for assets and Source instances for components using this context instance as an initializer, what strategies do I have for doing that?

Right now I have an AudioEmitter component. I was initially tracking its associated sources in the AudioSystem, but this made it hard to start and stop sounds in contexts where someone may only have access to the emitter component. So I did something like this to move source creation into the system, where the context is available:

#[derive(Clone, Default)]
pub struct AudioEmitter {
    sources: HashMap<String, Arc<Mutex<StaticSource>>>,
    new_sources: Vec<(String, Audio, bool)>,
}

impl AudioEmitter {
    pub fn insert<S: Into<String>>(&mut self, name: S, audio: Audio) {
        self.new_sources.push((name.into(), audio, false));
    }

    pub fn insert_and_play<S: Into<String>>(&mut self, name: S, audio: Audio) {
        self.new_sources.push((name.into(), audio, true));
    }

    pub fn set_loop<S: Into<String>>(&mut self, name: S, should_loop: bool) -> Result<(), Error> {
        let mut source = self.sources.get_mut(&name.into())
            .expect("Source not found")
            .lock().unwrap();
        source.set_looping(should_loop);
        Ok(())
    }
}

impl<'s> System<'s> for AudioSystem {
    type SystemData = (
        Entities<'s>,
        Read<'s, AssetStorage<Audio>>,
        ReadStorage<'s, Transform>,
        ReadExpect<'s, Context>,
        ReadStorage<'s, AudioListener>,
        WriteStorage<'s, AudioEmitter>,
    );

    fn run(
        &mut self,
        (entities, assets, transforms, context, listener, mut emitters): Self::SystemData,
    ) {
        for (entity, mut emitter) in (&entities, &mut emitters).join() {
            while let Some((name, audio, autostart)) = &emitter.new_sources.pop() {
                let buffer = match &audio.channels {
                    1 => {
                        Arc::new(context.new_buffer::<Mono<i16>, _>(&audio.bytes, audio.sample_rate).unwrap())
                    },
                    2 => {
                        Arc::new(context.new_buffer::<Stereo<i16>, _>(&audio.bytes, audio.sample_rate).unwrap())
                    },
                    _ => panic!("Unsupported channel count"),
                };
                let mut source = context.new_static_source().unwrap();
                source.set_buffer(buffer).unwrap();
                if *autostart {
                    source.play();
                }
                let source = Arc::new(Mutex::new(source));
                emitter.sources.insert(name.to_string(), source);
            }
...

The problem is that calling AudioEmitter::set_loop() only works once the source has actually been inserted, and that won’t happen until the system has had at least one run iteration to do the insert. So initializing an emitter like this won’t work currently:

        let mut emitter = AudioEmitter::default();
        emitter.insert_and_play("cockpit", cockpit);
        emitter.set_gain("cockpit", 0.5)?;
        emitter.set_loop("cockpit", true)?;

The only thing I can think of is to make the new_sources field of the component hold something like:

struct Details {
    should_loop: bool,
    gain: f32,
    ...
}

Then each method on the emitter can check whether the given source is still pending: if so, it updates the Details entry; if the source has already been created, it updates the Source directly. The system would then read this Details object and do the initial setup based on it.
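Roughly, something like this sketch (the pending map and the method body are just an illustration of the idea, not code I’ve tested):

pub struct AudioEmitter {
    sources: HashMap<String, Arc<Mutex<StaticSource>>>,
    // Desired initial state for sources the system hasn't created yet.
    pending: HashMap<String, Details>,
}

impl AudioEmitter {
    pub fn set_loop<S: Into<String>>(&mut self, name: S, should_loop: bool) {
        let name = name.into();
        if let Some(source) = self.sources.get_mut(&name) {
            // The system already created this source: apply the change directly.
            source.lock().unwrap().set_looping(should_loop);
        } else if let Some(details) = self.pending.get_mut(&name) {
            // Still pending: record the desired state for the system to apply later.
            details.should_loop = should_loop;
        }
    }
}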

Is this really the best way to design this? I guess all the complexity will be hidden from the end user, but I’m wondering if I’m missing something obvious.

And I guess this dovetails a bit with my asset issue. Currently I’m creating a new buffer for each asset handle. I need the context to create buffers, so I suppose I could map handles to buffers in the system. But then I’ve got both raw bytes and buffers in memory for each sound, and there may be a delay on a sound’s first use while the buffer is allocated. I don’t begrudge Rust/Amethyst for making me jump through these hoops in the name of safety, I just wonder if I’m missing some obvious solution to make this simpler. :slight_smile:

Thanks again.

0 Likes

(Kae) #8

While it’s true that asset data is loaded from sources in parallel, there is a processing step that converts from Asset::Data to Asset.

The flow for assets goes something like this:

Source file -> Asset::Data -> Asset

Asset is a trait that defines an associated type for the “intermediate data”: the data that has not yet been processed by other game systems. The transition from source file to Asset::Data is done by implementations of the Format trait; this happens in parallel to the game loop and thus cannot access World. The transition from Asset::Data to Asset is, for simple cases, performed by Processor, a simplified System that only requires you to define the conversion from Asset::Data to Asset as a pure function.

You can, however, implement your own system that calls process on the AssetStorage. An example of this kind of custom asset processor can be found in amethyst_renderer here. I think this would be an ideal thing to do for you.

As far as I can tell, Asset should be implemented for the “Alto-owned” audio buffers, and its Asset::Data should be an AudioRaw thing which is the raw u8 buffers that you intend to pass to Alto. Then, you’d pass a closure to the process function that takes AudioRaw and produces the Alto-owned buffer type.
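A rough sketch of that split (module paths, field names, and the HandleStorage choice are assumptions you’d adapt to your crate):

use std::sync::Arc;
use alto::Buffer;
use amethyst::assets::{Asset, Handle};
use amethyst::ecs::VecStorage;

// Intermediate data produced by the format: decoded samples, no Alto types yet.
pub struct AudioRaw {
    pub bytes: Vec<i16>,   // assuming 16-bit samples
    pub sample_rate: i32,
    pub channels: u16,
}

// The processed asset: an Alto-owned buffer ready to attach to sources.
pub struct Audio {
    pub buffer: Arc<Buffer>,
}

impl Asset for Audio {
    const NAME: &'static str = "my_game::Audio";
    type Data = AudioRaw;
    type HandleStorage = VecStorage<Handle<Self>>;
}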

0 Likes

(Nolan) #9

Thanks, this looks great! I looked at the system you linked, and it looks a bit dense since I’m not familiar with how the renderer works. Is it just an ordinary system that runs a join over some storage? If so, what specifically would it join over for some type Audio that implements Asset, and how do I distinguish a processed asset from an unprocessed asset?

Also, I assume my raw data gets dropped so I don’t have two copies of it around?

0 Likes

(Kae) #10

The Loader loads Asset::Data from Sources (basically from the files), then sends these to the AssetStorages. The AssetStorage buffers the Asset::Data in an internal queue until something calls process on the AssetStorage with a closure that converts from Asset::Data to Asset.

All you need to do is to impl Asset for your audio asset, then create a regular system that calls process on the AssetStorage with a closure for creating Audio from the raw data.

The docs for AssetStorage’s process function is here: https://docs.rs/amethyst_assets/0.6.1/amethyst_assets/struct.AssetStorage.html#method.process

Then you can use the AssetStorage in the same way you would use it for any other asset type.

The raw data gets moved into the closure, so you can decide what you want to do with it there. :slight_smile:
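As a sketch of what such a system could look like (the extra arguments to process, namely the frame number, thread pool, and hot-reload strategy, mirror what the built-in Processor passes in this version of amethyst_assets; double-check the linked docs and adjust the use paths for your setup):

use std::ops::Deref;
use std::sync::Arc;
use alto::{Context, Mono, Stereo};
use amethyst::assets::{AssetStorage, HotReloadStrategy, ProcessingState};
use amethyst::core::{ArcThreadPool, Time};
use amethyst::ecs::{Read, ReadExpect, System, Write};

pub struct AudioProcessorSystem;

impl<'a> System<'a> for AudioProcessorSystem {
    type SystemData = (
        Write<'a, AssetStorage<Audio>>,
        ReadExpect<'a, Context>,
        ReadExpect<'a, ArcThreadPool>,
        Read<'a, Time>,
        Option<Read<'a, HotReloadStrategy>>,
    );

    fn run(&mut self, (mut storage, context, pool, time, strategy): Self::SystemData) {
        storage.process(
            |raw: AudioRaw| {
                // Turn the raw samples into an Alto-owned buffer, as in your AudioSystem.
                let buffer = match raw.channels {
                    1 => context.new_buffer::<Mono<i16>, _>(&raw.bytes, raw.sample_rate).unwrap(),
                    2 => context.new_buffer::<Stereo<i16>, _>(&raw.bytes, raw.sample_rate).unwrap(),
                    _ => panic!("Unsupported channel count"),
                };
                Ok(ProcessingState::Loaded(Audio { buffer: Arc::new(buffer) }))
            },
            time.frame_number(),
            &**pool,
            strategy.as_ref().map(Deref::deref),
        );
    }
}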

0 Likes

(Nolan) #11

OK, I appear to have it working, just copied code and crossed fingers until it worked. :slight_smile:
I think the piece I was missing, and want to confirm, is this:

impl From<RawAudio> for Result<ProcessingState<Audio>, Error> {
    fn from(raw: RawAudio) -> Result<ProcessingState<Audio>, Error> {
        Ok(ProcessingState::Loading(raw))
    }
}

I’m guessing that’s how the asset loader knows that, when it encounters a RawAudio, it isn’t done processing that particular asset? Returning Loaded would end processing, but Loading indicates that the asset isn’t yet ready? At that point I guess the asset’s load gets deferred until my system picks it up, returns Loaded, and that asset’s processing terminates?

0 Likes

(Azriel Hoh) #12

Gonna repeat some things you probably already know, but I think the repetition would solidify that knowledge.

Exactly!

Backtracking a bit:

When it encounters the RawAudio – which should be the Data type in your impl Asset block – it needs to turn that into the Asset type asynchronously.

If the Data type is the same as the Asset, that is, impl Asset for MyAsset { type Data = Self; }, then the Processor system is a simple implementation that just uses Into::into (source) to turn MyAsset into ProcessingState<MyAsset>.

If the Data type is something else, for example impl Asset for MyAsset { type Data = Handle<AnotherAsset>; }, then you can write another system that attempts to retrieve AnotherAsset from AssetStorage<AnotherAsset>; if that returns None, you return ProcessingState::Loading(another_asset_handle) while waiting for AnotherAsset to finish loading, before you can finish loading MyAsset.
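A quick sketch of that second case, showing just the closure you would hand to process (the types and the From impl are placeholders):

// MyAsset's Data is a handle to AnotherAsset that may still be loading.
|another_handle: Handle<AnotherAsset>| -> Result<ProcessingState<MyAsset>, Error> {
    match another_storage.get(&another_handle) {
        // The dependency has finished loading: build MyAsset from it.
        Some(another) => Ok(ProcessingState::Loaded(MyAsset::from(another))),
        // Not loaded yet: hand the handle back so processing is retried later.
        None => Ok(ProcessingState::Loading(another_handle)),
    }
}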

0 Likes